Jan 15 01:59:27.676577 kernel: Linux version 6.12.65-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT_DYNAMIC Wed Jan 14 22:02:13 -00 2026 Jan 15 01:59:27.676604 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=1042e64ca7212ba2a277cb872bdf1dc4e195c9fb8110078c443b3efbd2488cb9 Jan 15 01:59:27.676614 kernel: BIOS-provided physical RAM map: Jan 15 01:59:27.676621 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 15 01:59:27.676627 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable Jan 15 01:59:27.676633 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Jan 15 01:59:27.676642 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable Jan 15 01:59:27.676649 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Jan 15 01:59:27.676655 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable Jan 15 01:59:27.676661 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Jan 15 01:59:27.676668 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000007e93efff] usable Jan 15 01:59:27.676674 kernel: BIOS-e820: [mem 0x000000007e93f000-0x000000007e9fffff] reserved Jan 15 01:59:27.676681 kernel: BIOS-e820: [mem 0x000000007ea00000-0x000000007ec70fff] usable Jan 15 01:59:27.676687 kernel: BIOS-e820: [mem 0x000000007ec71000-0x000000007ed84fff] reserved Jan 15 01:59:27.676696 kernel: BIOS-e820: [mem 0x000000007ed85000-0x000000007f8ecfff] usable Jan 15 01:59:27.676703 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 15 01:59:27.676710 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 15 01:59:27.676717 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 15 01:59:27.676723 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007feaefff] usable Jan 15 01:59:27.676730 kernel: BIOS-e820: [mem 0x000000007feaf000-0x000000007feb2fff] reserved Jan 15 01:59:27.676738 kernel: BIOS-e820: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS Jan 15 01:59:27.676744 kernel: BIOS-e820: [mem 0x000000007feb5000-0x000000007feebfff] usable Jan 15 01:59:27.676751 kernel: BIOS-e820: [mem 0x000000007feec000-0x000000007ff6ffff] reserved Jan 15 01:59:27.676757 kernel: BIOS-e820: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS Jan 15 01:59:27.676764 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 15 01:59:27.676770 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 15 01:59:27.676777 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 15 01:59:27.676783 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000017fffffff] usable Jan 15 01:59:27.676790 kernel: NX (Execute Disable) protection: active Jan 15 01:59:27.676796 kernel: APIC: Static calls initialized Jan 15 01:59:27.676803 kernel: e820: update [mem 0x7df7f018-0x7df88a57] usable ==> usable Jan 15 01:59:27.676811 kernel: e820: update [mem 0x7df57018-0x7df7e457] usable ==> usable Jan 15 01:59:27.676818 kernel: extended physical RAM map: Jan 15 01:59:27.676825 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Jan 15 
01:59:27.676831 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable Jan 15 01:59:27.676838 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Jan 15 01:59:27.676845 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable Jan 15 01:59:27.676851 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Jan 15 01:59:27.676858 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable Jan 15 01:59:27.676865 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Jan 15 01:59:27.676875 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000007df57017] usable Jan 15 01:59:27.676883 kernel: reserve setup_data: [mem 0x000000007df57018-0x000000007df7e457] usable Jan 15 01:59:27.676890 kernel: reserve setup_data: [mem 0x000000007df7e458-0x000000007df7f017] usable Jan 15 01:59:27.676897 kernel: reserve setup_data: [mem 0x000000007df7f018-0x000000007df88a57] usable Jan 15 01:59:27.676905 kernel: reserve setup_data: [mem 0x000000007df88a58-0x000000007e93efff] usable Jan 15 01:59:27.676912 kernel: reserve setup_data: [mem 0x000000007e93f000-0x000000007e9fffff] reserved Jan 15 01:59:27.676919 kernel: reserve setup_data: [mem 0x000000007ea00000-0x000000007ec70fff] usable Jan 15 01:59:27.676926 kernel: reserve setup_data: [mem 0x000000007ec71000-0x000000007ed84fff] reserved Jan 15 01:59:27.676933 kernel: reserve setup_data: [mem 0x000000007ed85000-0x000000007f8ecfff] usable Jan 15 01:59:27.676940 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved Jan 15 01:59:27.676947 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data Jan 15 01:59:27.676954 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS Jan 15 01:59:27.676961 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007feaefff] usable Jan 15 01:59:27.676968 kernel: reserve setup_data: [mem 0x000000007feaf000-0x000000007feb2fff] reserved Jan 15 01:59:27.676975 kernel: reserve setup_data: [mem 0x000000007feb3000-0x000000007feb4fff] ACPI NVS Jan 15 01:59:27.676983 kernel: reserve setup_data: [mem 0x000000007feb5000-0x000000007feebfff] usable Jan 15 01:59:27.676990 kernel: reserve setup_data: [mem 0x000000007feec000-0x000000007ff6ffff] reserved Jan 15 01:59:27.676997 kernel: reserve setup_data: [mem 0x000000007ff70000-0x000000007fffffff] ACPI NVS Jan 15 01:59:27.677004 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved Jan 15 01:59:27.677011 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jan 15 01:59:27.677018 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Jan 15 01:59:27.677025 kernel: reserve setup_data: [mem 0x0000000100000000-0x000000017fffffff] usable Jan 15 01:59:27.677032 kernel: efi: EFI v2.7 by EDK II Jan 15 01:59:27.677039 kernel: efi: SMBIOS=0x7f972000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7dfd8018 RNG=0x7fb72018 Jan 15 01:59:27.677046 kernel: random: crng init done Jan 15 01:59:27.677053 kernel: efi: Remove mem139: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map Jan 15 01:59:27.677071 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved Jan 15 01:59:27.677078 kernel: secureboot: Secure boot disabled Jan 15 01:59:27.677085 kernel: SMBIOS 2.8 present. 
Jan 15 01:59:27.677092 kernel: DMI: STACKIT Cloud OpenStack Nova/Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022 Jan 15 01:59:27.677099 kernel: DMI: Memory slots populated: 1/1 Jan 15 01:59:27.677106 kernel: Hypervisor detected: KVM Jan 15 01:59:27.677114 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000 Jan 15 01:59:27.677121 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Jan 15 01:59:27.677128 kernel: kvm-clock: using sched offset of 5924520962 cycles Jan 15 01:59:27.677135 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Jan 15 01:59:27.677145 kernel: tsc: Detected 2294.608 MHz processor Jan 15 01:59:27.677164 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jan 15 01:59:27.677172 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jan 15 01:59:27.677180 kernel: last_pfn = 0x180000 max_arch_pfn = 0x10000000000 Jan 15 01:59:27.677187 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Jan 15 01:59:27.677195 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jan 15 01:59:27.677202 kernel: last_pfn = 0x7feec max_arch_pfn = 0x10000000000 Jan 15 01:59:27.677209 kernel: Using GB pages for direct mapping Jan 15 01:59:27.677219 kernel: ACPI: Early table checksum verification disabled Jan 15 01:59:27.677226 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS ) Jan 15 01:59:27.677234 kernel: ACPI: XSDT 0x000000007FB7D0E8 00004C (v01 BOCHS BXPC 00000001 01000013) Jan 15 01:59:27.677241 kernel: ACPI: FACP 0x000000007FB77000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 15 01:59:27.677249 kernel: ACPI: DSDT 0x000000007FB78000 00423C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 15 01:59:27.677256 kernel: ACPI: FACS 0x000000007FBDD000 000040 Jan 15 01:59:27.677264 kernel: ACPI: APIC 0x000000007FB76000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 15 01:59:27.677273 kernel: ACPI: MCFG 0x000000007FB75000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 15 01:59:27.677280 kernel: ACPI: WAET 0x000000007FB74000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 15 01:59:27.677288 kernel: ACPI: BGRT 0x000000007FB73000 000038 (v01 INTEL EDK2 00000002 01000013) Jan 15 01:59:27.677295 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb77000-0x7fb770f3] Jan 15 01:59:27.677302 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb78000-0x7fb7c23b] Jan 15 01:59:27.677310 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f] Jan 15 01:59:27.677317 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb76000-0x7fb7607f] Jan 15 01:59:27.677326 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb75000-0x7fb7503b] Jan 15 01:59:27.677333 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb74000-0x7fb74027] Jan 15 01:59:27.677340 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb73000-0x7fb73037] Jan 15 01:59:27.677347 kernel: No NUMA configuration found Jan 15 01:59:27.677355 kernel: Faking a node at [mem 0x0000000000000000-0x000000017fffffff] Jan 15 01:59:27.677362 kernel: NODE_DATA(0) allocated [mem 0x17fff8dc0-0x17fffffff] Jan 15 01:59:27.677370 kernel: Zone ranges: Jan 15 01:59:27.677377 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jan 15 01:59:27.677386 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Jan 15 01:59:27.677393 kernel: Normal [mem 0x0000000100000000-0x000000017fffffff] Jan 15 01:59:27.677401 kernel: Device empty Jan 15 01:59:27.677408 kernel: Movable zone start for each node Jan 
15 01:59:27.677415 kernel: Early memory node ranges Jan 15 01:59:27.677423 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Jan 15 01:59:27.677430 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff] Jan 15 01:59:27.677438 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff] Jan 15 01:59:27.677446 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff] Jan 15 01:59:27.677453 kernel: node 0: [mem 0x0000000000900000-0x000000007e93efff] Jan 15 01:59:27.677460 kernel: node 0: [mem 0x000000007ea00000-0x000000007ec70fff] Jan 15 01:59:27.677468 kernel: node 0: [mem 0x000000007ed85000-0x000000007f8ecfff] Jan 15 01:59:27.677481 kernel: node 0: [mem 0x000000007fbff000-0x000000007feaefff] Jan 15 01:59:27.677490 kernel: node 0: [mem 0x000000007feb5000-0x000000007feebfff] Jan 15 01:59:27.677497 kernel: node 0: [mem 0x0000000100000000-0x000000017fffffff] Jan 15 01:59:27.677505 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000017fffffff] Jan 15 01:59:27.677513 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 15 01:59:27.677522 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Jan 15 01:59:27.677530 kernel: On node 0, zone DMA: 8 pages in unavailable ranges Jan 15 01:59:27.677539 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jan 15 01:59:27.677547 kernel: On node 0, zone DMA: 239 pages in unavailable ranges Jan 15 01:59:27.677556 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges Jan 15 01:59:27.677564 kernel: On node 0, zone DMA32: 276 pages in unavailable ranges Jan 15 01:59:27.677572 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges Jan 15 01:59:27.677580 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges Jan 15 01:59:27.677588 kernel: On node 0, zone Normal: 276 pages in unavailable ranges Jan 15 01:59:27.677596 kernel: ACPI: PM-Timer IO Port: 0x608 Jan 15 01:59:27.677604 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Jan 15 01:59:27.677613 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Jan 15 01:59:27.677622 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Jan 15 01:59:27.677630 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Jan 15 01:59:27.677638 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Jan 15 01:59:27.677646 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Jan 15 01:59:27.677653 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Jan 15 01:59:27.677661 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 15 01:59:27.677671 kernel: TSC deadline timer available Jan 15 01:59:27.677679 kernel: CPU topo: Max. logical packages: 2 Jan 15 01:59:27.677687 kernel: CPU topo: Max. logical dies: 2 Jan 15 01:59:27.677695 kernel: CPU topo: Max. dies per package: 1 Jan 15 01:59:27.677703 kernel: CPU topo: Max. threads per core: 1 Jan 15 01:59:27.677710 kernel: CPU topo: Num. cores per package: 1 Jan 15 01:59:27.677719 kernel: CPU topo: Num. 
threads per package: 1 Jan 15 01:59:27.677726 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs Jan 15 01:59:27.677736 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Jan 15 01:59:27.677744 kernel: kvm-guest: KVM setup pv remote TLB flush Jan 15 01:59:27.677752 kernel: kvm-guest: setup PV sched yield Jan 15 01:59:27.677760 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices Jan 15 01:59:27.677768 kernel: Booting paravirtualized kernel on KVM Jan 15 01:59:27.677776 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 15 01:59:27.677784 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Jan 15 01:59:27.677793 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 Jan 15 01:59:27.677802 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 Jan 15 01:59:27.677810 kernel: pcpu-alloc: [0] 0 1 Jan 15 01:59:27.677817 kernel: kvm-guest: PV spinlocks enabled Jan 15 01:59:27.677825 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Jan 15 01:59:27.677834 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=1042e64ca7212ba2a277cb872bdf1dc4e195c9fb8110078c443b3efbd2488cb9 Jan 15 01:59:27.677843 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jan 15 01:59:27.677852 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 15 01:59:27.677860 kernel: Fallback order for Node 0: 0 Jan 15 01:59:27.677868 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1046694 Jan 15 01:59:27.677876 kernel: Policy zone: Normal Jan 15 01:59:27.677884 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 15 01:59:27.677892 kernel: software IO TLB: area num 2. Jan 15 01:59:27.677900 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jan 15 01:59:27.677910 kernel: ftrace: allocating 40097 entries in 157 pages Jan 15 01:59:27.677918 kernel: ftrace: allocated 157 pages with 5 groups Jan 15 01:59:27.677926 kernel: Dynamic Preempt: voluntary Jan 15 01:59:27.677934 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 15 01:59:27.677942 kernel: rcu: RCU event tracing is enabled. Jan 15 01:59:27.677951 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jan 15 01:59:27.677959 kernel: Trampoline variant of Tasks RCU enabled. Jan 15 01:59:27.677968 kernel: Rude variant of Tasks RCU enabled. Jan 15 01:59:27.677976 kernel: Tracing variant of Tasks RCU enabled. Jan 15 01:59:27.677984 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 15 01:59:27.677992 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jan 15 01:59:27.678000 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 15 01:59:27.678008 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 15 01:59:27.678016 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
Jan 15 01:59:27.678024 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Jan 15 01:59:27.678033 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jan 15 01:59:27.678041 kernel: Console: colour dummy device 80x25 Jan 15 01:59:27.678049 kernel: printk: legacy console [tty0] enabled Jan 15 01:59:27.678073 kernel: printk: legacy console [ttyS0] enabled Jan 15 01:59:27.678081 kernel: ACPI: Core revision 20240827 Jan 15 01:59:27.678090 kernel: APIC: Switch to symmetric I/O mode setup Jan 15 01:59:27.678098 kernel: x2apic enabled Jan 15 01:59:27.678107 kernel: APIC: Switched APIC routing to: physical x2apic Jan 15 01:59:27.678815 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() Jan 15 01:59:27.678825 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() Jan 15 01:59:27.678833 kernel: kvm-guest: setup PV IPIs Jan 15 01:59:27.678841 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns Jan 15 01:59:27.678850 kernel: Calibrating delay loop (skipped) preset value.. 4589.21 BogoMIPS (lpj=2294608) Jan 15 01:59:27.678858 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Jan 15 01:59:27.678870 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Jan 15 01:59:27.678878 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Jan 15 01:59:27.678886 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 15 01:59:27.678893 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Jan 15 01:59:27.678901 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 15 01:59:27.678915 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Jan 15 01:59:27.678923 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 15 01:59:27.678931 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 15 01:59:27.678938 kernel: TAA: Mitigation: Clear CPU buffers Jan 15 01:59:27.678945 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers Jan 15 01:59:27.678955 kernel: active return thunk: its_return_thunk Jan 15 01:59:27.678962 kernel: ITS: Mitigation: Aligned branch/return thunks Jan 15 01:59:27.678970 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 15 01:59:27.678978 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 15 01:59:27.678985 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 15 01:59:27.678993 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Jan 15 01:59:27.679000 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Jan 15 01:59:27.679008 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Jan 15 01:59:27.679015 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers' Jan 15 01:59:27.679023 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 15 01:59:27.679032 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Jan 15 01:59:27.679039 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Jan 15 01:59:27.679047 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 Jan 15 01:59:27.679071 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8 Jan 15 01:59:27.679294 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format. 
Jan 15 01:59:27.679303 kernel: Freeing SMP alternatives memory: 32K Jan 15 01:59:27.679311 kernel: pid_max: default: 32768 minimum: 301 Jan 15 01:59:27.679319 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jan 15 01:59:27.679326 kernel: landlock: Up and running. Jan 15 01:59:27.679334 kernel: SELinux: Initializing. Jan 15 01:59:27.679341 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 15 01:59:27.679352 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 15 01:59:27.679359 kernel: smpboot: CPU0: Intel(R) Xeon(R) Silver 4316 CPU @ 2.30GHz (family: 0x6, model: 0x6a, stepping: 0x6) Jan 15 01:59:27.679367 kernel: Performance Events: PEBS fmt0-, Icelake events, full-width counters, Intel PMU driver. Jan 15 01:59:27.679375 kernel: ... version: 2 Jan 15 01:59:27.679383 kernel: ... bit width: 48 Jan 15 01:59:27.679391 kernel: ... generic registers: 8 Jan 15 01:59:27.679400 kernel: ... value mask: 0000ffffffffffff Jan 15 01:59:27.679407 kernel: ... max period: 00007fffffffffff Jan 15 01:59:27.679417 kernel: ... fixed-purpose events: 3 Jan 15 01:59:27.679425 kernel: ... event mask: 00000007000000ff Jan 15 01:59:27.679433 kernel: signal: max sigframe size: 3632 Jan 15 01:59:27.679441 kernel: rcu: Hierarchical SRCU implementation. Jan 15 01:59:27.679450 kernel: rcu: Max phase no-delay instances is 400. Jan 15 01:59:27.679458 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jan 15 01:59:27.679466 kernel: smp: Bringing up secondary CPUs ... Jan 15 01:59:27.679475 kernel: smpboot: x86: Booting SMP configuration: Jan 15 01:59:27.679483 kernel: .... node #0, CPUs: #1 Jan 15 01:59:27.679491 kernel: smp: Brought up 1 node, 2 CPUs Jan 15 01:59:27.679499 kernel: smpboot: Total of 2 processors activated (9178.43 BogoMIPS) Jan 15 01:59:27.679507 kernel: Memory: 3971816K/4186776K available (14336K kernel code, 2445K rwdata, 29896K rodata, 15432K init, 2608K bss, 210080K reserved, 0K cma-reserved) Jan 15 01:59:27.679515 kernel: devtmpfs: initialized Jan 15 01:59:27.679523 kernel: x86/mm: Memory block size: 128MB Jan 15 01:59:27.679532 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes) Jan 15 01:59:27.679540 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes) Jan 15 01:59:27.679548 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes) Jan 15 01:59:27.679556 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes) Jan 15 01:59:27.679564 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feb3000-0x7feb4fff] (8192 bytes) Jan 15 01:59:27.679572 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7ff70000-0x7fffffff] (589824 bytes) Jan 15 01:59:27.679580 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 15 01:59:27.679589 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jan 15 01:59:27.679597 kernel: pinctrl core: initialized pinctrl subsystem Jan 15 01:59:27.679605 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 15 01:59:27.679613 kernel: audit: initializing netlink subsys (disabled) Jan 15 01:59:27.679621 kernel: audit: type=2000 audit(1768442363.767:1): state=initialized audit_enabled=0 res=1 Jan 15 01:59:27.679629 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 15 01:59:27.679636 kernel: thermal_sys: Registered thermal governor 'user_space' 
Jan 15 01:59:27.679645 kernel: cpuidle: using governor menu Jan 15 01:59:27.679653 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 15 01:59:27.679661 kernel: dca service started, version 1.12.1 Jan 15 01:59:27.679669 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Jan 15 01:59:27.679677 kernel: PCI: Using configuration type 1 for base access Jan 15 01:59:27.679685 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 15 01:59:27.679693 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 15 01:59:27.679703 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 15 01:59:27.679711 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 15 01:59:27.679719 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 15 01:59:27.679727 kernel: ACPI: Added _OSI(Module Device) Jan 15 01:59:27.679734 kernel: ACPI: Added _OSI(Processor Device) Jan 15 01:59:27.679743 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 15 01:59:27.679751 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 15 01:59:27.679758 kernel: ACPI: Interpreter enabled Jan 15 01:59:27.679768 kernel: ACPI: PM: (supports S0 S3 S5) Jan 15 01:59:27.679775 kernel: ACPI: Using IOAPIC for interrupt routing Jan 15 01:59:27.679783 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 15 01:59:27.679791 kernel: PCI: Using E820 reservations for host bridge windows Jan 15 01:59:27.679799 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Jan 15 01:59:27.679807 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 15 01:59:27.679964 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 15 01:59:27.681396 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Jan 15 01:59:27.681512 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Jan 15 01:59:27.681523 kernel: PCI host bridge to bus 0000:00 Jan 15 01:59:27.681625 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 15 01:59:27.681716 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Jan 15 01:59:27.681808 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 15 01:59:27.681896 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window] Jan 15 01:59:27.681985 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window] Jan 15 01:59:27.682826 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x38e800003fff window] Jan 15 01:59:27.682935 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 15 01:59:27.683052 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Jan 15 01:59:27.683647 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint Jan 15 01:59:27.683751 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref] Jan 15 01:59:27.683853 kernel: pci 0000:00:01.0: BAR 2 [mem 0x38e800000000-0x38e800003fff 64bit pref] Jan 15 01:59:27.683948 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8439e000-0x8439efff] Jan 15 01:59:27.684043 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref] Jan 15 01:59:27.688635 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 15 01:59:27.688746 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 
0x060400 PCIe Root Port Jan 15 01:59:27.688846 kernel: pci 0000:00:02.0: BAR 0 [mem 0x8439d000-0x8439dfff] Jan 15 01:59:27.688947 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 15 01:59:27.689045 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Jan 15 01:59:27.689194 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Jan 15 01:59:27.689298 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 15 01:59:27.689405 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.689502 kernel: pci 0000:00:02.1: BAR 0 [mem 0x8439c000-0x8439cfff] Jan 15 01:59:27.689599 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 15 01:59:27.689694 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Jan 15 01:59:27.689793 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 15 01:59:27.689897 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.689994 kernel: pci 0000:00:02.2: BAR 0 [mem 0x8439b000-0x8439bfff] Jan 15 01:59:27.691750 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 15 01:59:27.691864 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Jan 15 01:59:27.691961 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 15 01:59:27.692081 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.692184 kernel: pci 0000:00:02.3: BAR 0 [mem 0x8439a000-0x8439afff] Jan 15 01:59:27.692283 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 15 01:59:27.692380 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Jan 15 01:59:27.692477 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 15 01:59:27.692587 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.692686 kernel: pci 0000:00:02.4: BAR 0 [mem 0x84399000-0x84399fff] Jan 15 01:59:27.692783 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 15 01:59:27.692878 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Jan 15 01:59:27.692973 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 15 01:59:27.693086 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.693196 kernel: pci 0000:00:02.5: BAR 0 [mem 0x84398000-0x84398fff] Jan 15 01:59:27.693295 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 15 01:59:27.693391 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Jan 15 01:59:27.693486 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 15 01:59:27.693588 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.693685 kernel: pci 0000:00:02.6: BAR 0 [mem 0x84397000-0x84397fff] Jan 15 01:59:27.693783 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 15 01:59:27.693879 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Jan 15 01:59:27.693973 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 15 01:59:27.700075 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.700189 kernel: pci 0000:00:02.7: BAR 0 [mem 0x84396000-0x84396fff] Jan 15 01:59:27.700291 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 15 01:59:27.700414 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Jan 15 
01:59:27.700828 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 15 01:59:27.700947 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.701049 kernel: pci 0000:00:03.0: BAR 0 [mem 0x84395000-0x84395fff] Jan 15 01:59:27.701305 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 15 01:59:27.701408 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Jan 15 01:59:27.701513 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 15 01:59:27.701624 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.701736 kernel: pci 0000:00:03.1: BAR 0 [mem 0x84394000-0x84394fff] Jan 15 01:59:27.701836 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 15 01:59:27.701935 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Jan 15 01:59:27.702033 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 15 01:59:27.702150 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.702249 kernel: pci 0000:00:03.2: BAR 0 [mem 0x84393000-0x84393fff] Jan 15 01:59:27.702347 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 15 01:59:27.702445 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Jan 15 01:59:27.702543 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 15 01:59:27.702647 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.702749 kernel: pci 0000:00:03.3: BAR 0 [mem 0x84392000-0x84392fff] Jan 15 01:59:27.702848 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 15 01:59:27.702945 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Jan 15 01:59:27.703045 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 15 01:59:27.705222 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.705335 kernel: pci 0000:00:03.4: BAR 0 [mem 0x84391000-0x84391fff] Jan 15 01:59:27.705436 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 15 01:59:27.705534 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Jan 15 01:59:27.705632 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 15 01:59:27.705742 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.705840 kernel: pci 0000:00:03.5: BAR 0 [mem 0x84390000-0x84390fff] Jan 15 01:59:27.705939 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 15 01:59:27.706037 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Jan 15 01:59:27.706144 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 15 01:59:27.706248 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.706349 kernel: pci 0000:00:03.6: BAR 0 [mem 0x8438f000-0x8438ffff] Jan 15 01:59:27.706447 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 15 01:59:27.706544 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Jan 15 01:59:27.706641 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 15 01:59:27.706745 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.706843 kernel: pci 0000:00:03.7: BAR 0 [mem 0x8438e000-0x8438efff] Jan 15 01:59:27.706943 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 15 
01:59:27.707039 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Jan 15 01:59:27.709211 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 15 01:59:27.709390 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.709502 kernel: pci 0000:00:04.0: BAR 0 [mem 0x8438d000-0x8438dfff] Jan 15 01:59:27.709599 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 15 01:59:27.709700 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Jan 15 01:59:27.709795 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 15 01:59:27.709897 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.709993 kernel: pci 0000:00:04.1: BAR 0 [mem 0x8438c000-0x8438cfff] Jan 15 01:59:27.710346 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 15 01:59:27.710449 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Jan 15 01:59:27.710547 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 15 01:59:27.710651 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.710747 kernel: pci 0000:00:04.2: BAR 0 [mem 0x8438b000-0x8438bfff] Jan 15 01:59:27.710842 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 15 01:59:27.710936 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Jan 15 01:59:27.711031 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 15 01:59:27.711181 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.711277 kernel: pci 0000:00:04.3: BAR 0 [mem 0x8438a000-0x8438afff] Jan 15 01:59:27.711371 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 15 01:59:27.711465 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 15 01:59:27.711559 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 15 01:59:27.711665 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.711763 kernel: pci 0000:00:04.4: BAR 0 [mem 0x84389000-0x84389fff] Jan 15 01:59:27.711858 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 15 01:59:27.711952 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 15 01:59:27.712046 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 15 01:59:27.712165 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.712264 kernel: pci 0000:00:04.5: BAR 0 [mem 0x84388000-0x84388fff] Jan 15 01:59:27.712358 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 15 01:59:27.712452 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 15 01:59:27.712546 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 15 01:59:27.712647 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.712747 kernel: pci 0000:00:04.6: BAR 0 [mem 0x84387000-0x84387fff] Jan 15 01:59:27.712842 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 15 01:59:27.712936 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 15 01:59:27.713030 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 15 01:59:27.713143 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.713254 kernel: pci 0000:00:04.7: BAR 0 [mem 
0x84386000-0x84386fff] Jan 15 01:59:27.713348 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 15 01:59:27.713442 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 15 01:59:27.713538 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 15 01:59:27.713644 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.713741 kernel: pci 0000:00:05.0: BAR 0 [mem 0x84385000-0x84385fff] Jan 15 01:59:27.713846 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 15 01:59:27.713941 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 15 01:59:27.714035 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 15 01:59:27.714153 kernel: pci 0000:00:05.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.714248 kernel: pci 0000:00:05.1: BAR 0 [mem 0x84384000-0x84384fff] Jan 15 01:59:27.714346 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 15 01:59:27.714441 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 15 01:59:27.714535 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 15 01:59:27.714636 kernel: pci 0000:00:05.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.714731 kernel: pci 0000:00:05.2: BAR 0 [mem 0x84383000-0x84383fff] Jan 15 01:59:27.714827 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 15 01:59:27.714924 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 15 01:59:27.715018 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 15 01:59:27.715394 kernel: pci 0000:00:05.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.715500 kernel: pci 0000:00:05.3: BAR 0 [mem 0x84382000-0x84382fff] Jan 15 01:59:27.715595 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 15 01:59:27.715690 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 15 01:59:27.715788 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 15 01:59:27.715891 kernel: pci 0000:00:05.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 15 01:59:27.715986 kernel: pci 0000:00:05.4: BAR 0 [mem 0x84381000-0x84381fff] Jan 15 01:59:27.716091 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 15 01:59:27.716187 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 15 01:59:27.716281 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 15 01:59:27.716389 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Jan 15 01:59:27.716484 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Jan 15 01:59:27.716586 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Jan 15 01:59:27.716682 kernel: pci 0000:00:1f.2: BAR 4 [io 0x7040-0x705f] Jan 15 01:59:27.716777 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x84380000-0x84380fff] Jan 15 01:59:27.716878 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Jan 15 01:59:27.717001 kernel: pci 0000:00:1f.3: BAR 4 [io 0x7000-0x703f] Jan 15 01:59:27.717127 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Jan 15 01:59:27.717773 kernel: pci 0000:01:00.0: BAR 0 [mem 0x84200000-0x842000ff 64bit] Jan 15 01:59:27.717884 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 15 01:59:27.717983 kernel: pci 
0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 15 01:59:27.718096 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 15 01:59:27.718203 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 15 01:59:27.718301 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 15 01:59:27.718409 kernel: pci_bus 0000:02: extended config space not accessible Jan 15 01:59:27.718421 kernel: acpiphp: Slot [1] registered Jan 15 01:59:27.718430 kernel: acpiphp: Slot [0] registered Jan 15 01:59:27.718439 kernel: acpiphp: Slot [2] registered Jan 15 01:59:27.718450 kernel: acpiphp: Slot [3] registered Jan 15 01:59:27.718458 kernel: acpiphp: Slot [4] registered Jan 15 01:59:27.718466 kernel: acpiphp: Slot [5] registered Jan 15 01:59:27.718474 kernel: acpiphp: Slot [6] registered Jan 15 01:59:27.718482 kernel: acpiphp: Slot [7] registered Jan 15 01:59:27.718490 kernel: acpiphp: Slot [8] registered Jan 15 01:59:27.718498 kernel: acpiphp: Slot [9] registered Jan 15 01:59:27.718510 kernel: acpiphp: Slot [10] registered Jan 15 01:59:27.718518 kernel: acpiphp: Slot [11] registered Jan 15 01:59:27.718526 kernel: acpiphp: Slot [12] registered Jan 15 01:59:27.718534 kernel: acpiphp: Slot [13] registered Jan 15 01:59:27.718542 kernel: acpiphp: Slot [14] registered Jan 15 01:59:27.718550 kernel: acpiphp: Slot [15] registered Jan 15 01:59:27.718558 kernel: acpiphp: Slot [16] registered Jan 15 01:59:27.718567 kernel: acpiphp: Slot [17] registered Jan 15 01:59:27.718577 kernel: acpiphp: Slot [18] registered Jan 15 01:59:27.718585 kernel: acpiphp: Slot [19] registered Jan 15 01:59:27.718593 kernel: acpiphp: Slot [20] registered Jan 15 01:59:27.718602 kernel: acpiphp: Slot [21] registered Jan 15 01:59:27.718610 kernel: acpiphp: Slot [22] registered Jan 15 01:59:27.718618 kernel: acpiphp: Slot [23] registered Jan 15 01:59:27.718626 kernel: acpiphp: Slot [24] registered Jan 15 01:59:27.718636 kernel: acpiphp: Slot [25] registered Jan 15 01:59:27.718644 kernel: acpiphp: Slot [26] registered Jan 15 01:59:27.718653 kernel: acpiphp: Slot [27] registered Jan 15 01:59:27.718661 kernel: acpiphp: Slot [28] registered Jan 15 01:59:27.718669 kernel: acpiphp: Slot [29] registered Jan 15 01:59:27.718677 kernel: acpiphp: Slot [30] registered Jan 15 01:59:27.718685 kernel: acpiphp: Slot [31] registered Jan 15 01:59:27.718794 kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint Jan 15 01:59:27.718898 kernel: pci 0000:02:01.0: BAR 4 [io 0x6000-0x601f] Jan 15 01:59:27.718996 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 15 01:59:27.719007 kernel: acpiphp: Slot [0-2] registered Jan 15 01:59:27.719130 kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 15 01:59:27.719231 kernel: pci 0000:03:00.0: BAR 1 [mem 0x83e00000-0x83e00fff] Jan 15 01:59:27.719333 kernel: pci 0000:03:00.0: BAR 4 [mem 0x380800000000-0x380800003fff 64bit pref] Jan 15 01:59:27.719432 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 15 01:59:27.719530 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 15 01:59:27.719541 kernel: acpiphp: Slot [0-3] registered Jan 15 01:59:27.719644 kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Jan 15 01:59:27.719744 kernel: pci 0000:04:00.0: BAR 1 [mem 0x83c00000-0x83c00fff] Jan 15 01:59:27.719844 kernel: pci 0000:04:00.0: BAR 4 [mem 0x381000000000-0x381000003fff 64bit pref] Jan 15 01:59:27.719940 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 15 01:59:27.719950 
kernel: acpiphp: Slot [0-4] registered Jan 15 01:59:27.720074 kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 15 01:59:27.720176 kernel: pci 0000:05:00.0: BAR 4 [mem 0x381800000000-0x381800003fff 64bit pref] Jan 15 01:59:27.720272 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 15 01:59:27.720286 kernel: acpiphp: Slot [0-5] registered Jan 15 01:59:27.720390 kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 15 01:59:27.720488 kernel: pci 0000:06:00.0: BAR 1 [mem 0x83800000-0x83800fff] Jan 15 01:59:27.720586 kernel: pci 0000:06:00.0: BAR 4 [mem 0x382000000000-0x382000003fff 64bit pref] Jan 15 01:59:27.720681 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 15 01:59:27.720692 kernel: acpiphp: Slot [0-6] registered Jan 15 01:59:27.720789 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 15 01:59:27.720800 kernel: acpiphp: Slot [0-7] registered Jan 15 01:59:27.720893 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 15 01:59:27.720904 kernel: acpiphp: Slot [0-8] registered Jan 15 01:59:27.720998 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 15 01:59:27.721009 kernel: acpiphp: Slot [0-9] registered Jan 15 01:59:27.721122 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 15 01:59:27.721133 kernel: acpiphp: Slot [0-10] registered Jan 15 01:59:27.721240 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 15 01:59:27.721252 kernel: acpiphp: Slot [0-11] registered Jan 15 01:59:27.721346 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 15 01:59:27.721357 kernel: acpiphp: Slot [0-12] registered Jan 15 01:59:27.721451 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 15 01:59:27.721464 kernel: acpiphp: Slot [0-13] registered Jan 15 01:59:27.721558 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 15 01:59:27.721569 kernel: acpiphp: Slot [0-14] registered Jan 15 01:59:27.721663 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 15 01:59:27.721674 kernel: acpiphp: Slot [0-15] registered Jan 15 01:59:27.721769 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 15 01:59:27.721781 kernel: acpiphp: Slot [0-16] registered Jan 15 01:59:27.721875 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 15 01:59:27.721886 kernel: acpiphp: Slot [0-17] registered Jan 15 01:59:27.721980 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 15 01:59:27.721991 kernel: acpiphp: Slot [0-18] registered Jan 15 01:59:27.722096 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 15 01:59:27.722107 kernel: acpiphp: Slot [0-19] registered Jan 15 01:59:27.722205 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 15 01:59:27.722216 kernel: acpiphp: Slot [0-20] registered Jan 15 01:59:27.722309 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 15 01:59:27.722320 kernel: acpiphp: Slot [0-21] registered Jan 15 01:59:27.722413 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 15 01:59:27.722423 kernel: acpiphp: Slot [0-22] registered Jan 15 01:59:27.722520 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 15 01:59:27.722530 kernel: acpiphp: Slot [0-23] registered Jan 15 01:59:27.722624 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 15 01:59:27.722635 kernel: acpiphp: Slot [0-24] registered Jan 15 01:59:27.722728 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 15 01:59:27.722739 kernel: acpiphp: Slot [0-25] registered Jan 15 01:59:27.722834 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 15 01:59:27.722845 kernel: acpiphp: Slot [0-26] registered Jan 15 01:59:27.722941 kernel: pci 0000:00:05.1: PCI bridge to [bus 
1b] Jan 15 01:59:27.722952 kernel: acpiphp: Slot [0-27] registered Jan 15 01:59:27.723046 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 15 01:59:27.723063 kernel: acpiphp: Slot [0-28] registered Jan 15 01:59:27.723166 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 15 01:59:27.723179 kernel: acpiphp: Slot [0-29] registered Jan 15 01:59:27.723273 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 15 01:59:27.723284 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Jan 15 01:59:27.723293 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Jan 15 01:59:27.723301 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 15 01:59:27.723310 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Jan 15 01:59:27.723318 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Jan 15 01:59:27.723332 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Jan 15 01:59:27.723340 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Jan 15 01:59:27.723348 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Jan 15 01:59:27.723356 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Jan 15 01:59:27.723365 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Jan 15 01:59:27.723373 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Jan 15 01:59:27.723381 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Jan 15 01:59:27.723391 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Jan 15 01:59:27.723399 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Jan 15 01:59:27.723408 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Jan 15 01:59:27.723416 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Jan 15 01:59:27.723425 kernel: iommu: Default domain type: Translated Jan 15 01:59:27.723433 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 15 01:59:27.723441 kernel: efivars: Registered efivars operations Jan 15 01:59:27.723451 kernel: PCI: Using ACPI for IRQ routing Jan 15 01:59:27.723460 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 15 01:59:27.723468 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Jan 15 01:59:27.723476 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Jan 15 01:59:27.723484 kernel: e820: reserve RAM buffer [mem 0x7df57018-0x7fffffff] Jan 15 01:59:27.723492 kernel: e820: reserve RAM buffer [mem 0x7df7f018-0x7fffffff] Jan 15 01:59:27.723500 kernel: e820: reserve RAM buffer [mem 0x7e93f000-0x7fffffff] Jan 15 01:59:27.723510 kernel: e820: reserve RAM buffer [mem 0x7ec71000-0x7fffffff] Jan 15 01:59:27.723518 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff] Jan 15 01:59:27.723526 kernel: e820: reserve RAM buffer [mem 0x7feaf000-0x7fffffff] Jan 15 01:59:27.723534 kernel: e820: reserve RAM buffer [mem 0x7feec000-0x7fffffff] Jan 15 01:59:27.723629 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Jan 15 01:59:27.723728 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Jan 15 01:59:27.723822 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 15 01:59:27.723835 kernel: vgaarb: loaded Jan 15 01:59:27.723843 kernel: clocksource: Switched to clocksource kvm-clock Jan 15 01:59:27.723852 kernel: VFS: Disk quotas dquot_6.6.0 Jan 15 01:59:27.723860 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 15 01:59:27.723869 kernel: pnp: PnP ACPI init Jan 15 01:59:27.723975 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] 
has been reserved Jan 15 01:59:27.723989 kernel: pnp: PnP ACPI: found 5 devices Jan 15 01:59:27.723997 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 15 01:59:27.724006 kernel: NET: Registered PF_INET protocol family Jan 15 01:59:27.724015 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 15 01:59:27.724024 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 15 01:59:27.724032 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 15 01:59:27.724041 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 15 01:59:27.724051 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 15 01:59:27.724065 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 15 01:59:27.724074 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 15 01:59:27.724082 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 15 01:59:27.724090 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 15 01:59:27.724099 kernel: NET: Registered PF_XDP protocol family Jan 15 01:59:27.724201 kernel: pci 0000:03:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Jan 15 01:59:27.724305 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 15 01:59:27.724401 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 15 01:59:27.724500 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Jan 15 01:59:27.724601 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 15 01:59:27.724697 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 15 01:59:27.724793 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 15 01:59:27.724891 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 15 01:59:27.724987 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 15 01:59:27.725098 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Jan 15 01:59:27.725224 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Jan 15 01:59:27.725321 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Jan 15 01:59:27.725418 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 15 01:59:27.725513 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 15 01:59:27.725611 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 15 01:59:27.725707 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 15 01:59:27.725803 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 15 01:59:27.725898 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 15 01:59:27.725993 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 15 01:59:27.726100 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 15 01:59:27.726199 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 15 01:59:27.726294 kernel: pci 
0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 15 01:59:27.726388 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 15 01:59:27.726482 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 15 01:59:27.726577 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 15 01:59:27.726671 kernel: pci 0000:00:05.1: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 15 01:59:27.726767 kernel: pci 0000:00:05.2: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 15 01:59:27.726863 kernel: pci 0000:00:05.3: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 15 01:59:27.726958 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 15 01:59:27.727053 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned Jan 15 01:59:27.727164 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned Jan 15 01:59:27.727259 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned Jan 15 01:59:27.727353 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned Jan 15 01:59:27.727451 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned Jan 15 01:59:27.727548 kernel: pci 0000:00:02.6: bridge window [io 0x8000-0x8fff]: assigned Jan 15 01:59:27.727642 kernel: pci 0000:00:02.7: bridge window [io 0x9000-0x9fff]: assigned Jan 15 01:59:27.727737 kernel: pci 0000:00:03.0: bridge window [io 0xa000-0xafff]: assigned Jan 15 01:59:27.727831 kernel: pci 0000:00:03.1: bridge window [io 0xb000-0xbfff]: assigned Jan 15 01:59:27.727925 kernel: pci 0000:00:03.2: bridge window [io 0xc000-0xcfff]: assigned Jan 15 01:59:27.728022 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]: assigned Jan 15 01:59:27.728139 kernel: pci 0000:00:03.4: bridge window [io 0xe000-0xefff]: assigned Jan 15 01:59:27.728235 kernel: pci 0000:00:03.5: bridge window [io 0xf000-0xffff]: assigned Jan 15 01:59:27.728328 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.728422 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.728516 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.728610 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.728708 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.728802 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.728896 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.728990 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.729098 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.729206 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.729306 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.729401 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.729495 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.729589 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.729686 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 15 
01:59:27.729784 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.729883 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.729979 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.730085 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.730182 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.730279 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.730376 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.730473 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.730572 kernel: pci 0000:00:05.1: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.730669 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.730765 kernel: pci 0000:00:05.2: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.730860 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.730957 kernel: pci 0000:00:05.3: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.731053 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.731157 kernel: pci 0000:00:05.4: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.731250 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff]: assigned Jan 15 01:59:27.731344 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff]: assigned Jan 15 01:59:27.731438 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff]: assigned Jan 15 01:59:27.731532 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff]: assigned Jan 15 01:59:27.731627 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff]: assigned Jan 15 01:59:27.731720 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff]: assigned Jan 15 01:59:27.731816 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff]: assigned Jan 15 01:59:27.731910 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff]: assigned Jan 15 01:59:27.732004 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff]: assigned Jan 15 01:59:27.732113 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff]: assigned Jan 15 01:59:27.732207 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff]: assigned Jan 15 01:59:27.732301 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff]: assigned Jan 15 01:59:27.732398 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff]: assigned Jan 15 01:59:27.732491 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.732585 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.732679 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.732774 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.732867 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.732961 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.733087 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.733192 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.733287 kernel: pci 
0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.733382 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.733476 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.733570 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.733668 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.733762 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.733857 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.733951 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.734046 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.734156 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.734250 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.734348 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.734443 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.734537 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.734632 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.734726 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.734820 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.734916 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.735011 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.735116 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.735213 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 15 01:59:27.735309 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 15 01:59:27.735409 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Jan 15 01:59:27.735507 kernel: pci 0000:01:00.0: bridge window [io 0x6000-0x6fff] Jan 15 01:59:27.735609 kernel: pci 0000:01:00.0: bridge window [mem 0x84000000-0x841fffff] Jan 15 01:59:27.735707 kernel: pci 0000:01:00.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 15 01:59:27.735803 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Jan 15 01:59:27.735899 kernel: pci 0000:00:02.0: bridge window [io 0x6000-0x6fff] Jan 15 01:59:27.735994 kernel: pci 0000:00:02.0: bridge window [mem 0x84000000-0x842fffff] Jan 15 01:59:27.736104 kernel: pci 0000:00:02.0: bridge window [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 15 01:59:27.736204 kernel: pci 0000:03:00.0: ROM [mem 0x83e80000-0x83efffff pref]: assigned Jan 15 01:59:27.736302 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Jan 15 01:59:27.736397 kernel: pci 0000:00:02.1: bridge window [mem 0x83e00000-0x83ffffff] Jan 15 01:59:27.736492 kernel: pci 0000:00:02.1: bridge window [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 15 01:59:27.736587 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Jan 15 01:59:27.736682 kernel: pci 0000:00:02.2: bridge window [mem 0x83c00000-0x83dfffff] Jan 15 01:59:27.736777 kernel: pci 0000:00:02.2: bridge window [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 15 
01:59:27.736871 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Jan 15 01:59:27.736966 kernel: pci 0000:00:02.3: bridge window [mem 0x83a00000-0x83bfffff] Jan 15 01:59:27.737072 kernel: pci 0000:00:02.3: bridge window [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 15 01:59:27.737185 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Jan 15 01:59:27.737279 kernel: pci 0000:00:02.4: bridge window [mem 0x83800000-0x839fffff] Jan 15 01:59:27.737373 kernel: pci 0000:00:02.4: bridge window [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 15 01:59:27.737467 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Jan 15 01:59:27.737560 kernel: pci 0000:00:02.5: bridge window [mem 0x83600000-0x837fffff] Jan 15 01:59:27.737654 kernel: pci 0000:00:02.5: bridge window [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 15 01:59:27.737753 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Jan 15 01:59:27.737846 kernel: pci 0000:00:02.6: bridge window [mem 0x83400000-0x835fffff] Jan 15 01:59:27.737940 kernel: pci 0000:00:02.6: bridge window [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 15 01:59:27.738034 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Jan 15 01:59:27.738142 kernel: pci 0000:00:02.7: bridge window [mem 0x83200000-0x833fffff] Jan 15 01:59:27.738239 kernel: pci 0000:00:02.7: bridge window [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 15 01:59:27.738334 kernel: pci 0000:00:03.0: PCI bridge to [bus 0a] Jan 15 01:59:27.738427 kernel: pci 0000:00:03.0: bridge window [mem 0x83000000-0x831fffff] Jan 15 01:59:27.738521 kernel: pci 0000:00:03.0: bridge window [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 15 01:59:27.738616 kernel: pci 0000:00:03.1: PCI bridge to [bus 0b] Jan 15 01:59:27.738709 kernel: pci 0000:00:03.1: bridge window [mem 0x82e00000-0x82ffffff] Jan 15 01:59:27.738803 kernel: pci 0000:00:03.1: bridge window [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 15 01:59:27.738897 kernel: pci 0000:00:03.2: PCI bridge to [bus 0c] Jan 15 01:59:27.738994 kernel: pci 0000:00:03.2: bridge window [mem 0x82c00000-0x82dfffff] Jan 15 01:59:27.739099 kernel: pci 0000:00:03.2: bridge window [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 15 01:59:27.739195 kernel: pci 0000:00:03.3: PCI bridge to [bus 0d] Jan 15 01:59:27.739291 kernel: pci 0000:00:03.3: bridge window [mem 0x82a00000-0x82bfffff] Jan 15 01:59:27.739386 kernel: pci 0000:00:03.3: bridge window [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 15 01:59:27.739481 kernel: pci 0000:00:03.4: PCI bridge to [bus 0e] Jan 15 01:59:27.739576 kernel: pci 0000:00:03.4: bridge window [mem 0x82800000-0x829fffff] Jan 15 01:59:27.739674 kernel: pci 0000:00:03.4: bridge window [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 15 01:59:27.739769 kernel: pci 0000:00:03.5: PCI bridge to [bus 0f] Jan 15 01:59:27.741001 kernel: pci 0000:00:03.5: bridge window [mem 0x82600000-0x827fffff] Jan 15 01:59:27.741136 kernel: pci 0000:00:03.5: bridge window [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 15 01:59:27.741249 kernel: pci 0000:00:03.6: PCI bridge to [bus 10] Jan 15 01:59:27.741346 kernel: pci 0000:00:03.6: bridge window [mem 0x82400000-0x825fffff] Jan 15 01:59:27.741442 kernel: pci 0000:00:03.6: bridge window [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 15 01:59:27.741542 kernel: pci 0000:00:03.7: PCI bridge to [bus 11] Jan 15 01:59:27.741636 kernel: pci 0000:00:03.7: bridge window [mem 0x82200000-0x823fffff] Jan 15 01:59:27.741732 kernel: pci 0000:00:03.7: bridge window [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 15 
01:59:27.741827 kernel: pci 0000:00:04.0: PCI bridge to [bus 12] Jan 15 01:59:27.741921 kernel: pci 0000:00:04.0: bridge window [io 0xf000-0xffff] Jan 15 01:59:27.742017 kernel: pci 0000:00:04.0: bridge window [mem 0x82000000-0x821fffff] Jan 15 01:59:27.742360 kernel: pci 0000:00:04.0: bridge window [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 15 01:59:27.742465 kernel: pci 0000:00:04.1: PCI bridge to [bus 13] Jan 15 01:59:27.742560 kernel: pci 0000:00:04.1: bridge window [io 0xe000-0xefff] Jan 15 01:59:27.742654 kernel: pci 0000:00:04.1: bridge window [mem 0x81e00000-0x81ffffff] Jan 15 01:59:27.742748 kernel: pci 0000:00:04.1: bridge window [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 15 01:59:27.742843 kernel: pci 0000:00:04.2: PCI bridge to [bus 14] Jan 15 01:59:27.742937 kernel: pci 0000:00:04.2: bridge window [io 0xd000-0xdfff] Jan 15 01:59:27.744869 kernel: pci 0000:00:04.2: bridge window [mem 0x81c00000-0x81dfffff] Jan 15 01:59:27.744977 kernel: pci 0000:00:04.2: bridge window [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 15 01:59:27.745086 kernel: pci 0000:00:04.3: PCI bridge to [bus 15] Jan 15 01:59:27.745203 kernel: pci 0000:00:04.3: bridge window [io 0xc000-0xcfff] Jan 15 01:59:27.745300 kernel: pci 0000:00:04.3: bridge window [mem 0x81a00000-0x81bfffff] Jan 15 01:59:27.745394 kernel: pci 0000:00:04.3: bridge window [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 15 01:59:27.745495 kernel: pci 0000:00:04.4: PCI bridge to [bus 16] Jan 15 01:59:27.745589 kernel: pci 0000:00:04.4: bridge window [io 0xb000-0xbfff] Jan 15 01:59:27.745684 kernel: pci 0000:00:04.4: bridge window [mem 0x81800000-0x819fffff] Jan 15 01:59:27.745778 kernel: pci 0000:00:04.4: bridge window [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 15 01:59:27.745876 kernel: pci 0000:00:04.5: PCI bridge to [bus 17] Jan 15 01:59:27.745970 kernel: pci 0000:00:04.5: bridge window [io 0xa000-0xafff] Jan 15 01:59:27.746090 kernel: pci 0000:00:04.5: bridge window [mem 0x81600000-0x817fffff] Jan 15 01:59:27.747638 kernel: pci 0000:00:04.5: bridge window [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 15 01:59:27.747740 kernel: pci 0000:00:04.6: PCI bridge to [bus 18] Jan 15 01:59:27.747835 kernel: pci 0000:00:04.6: bridge window [io 0x9000-0x9fff] Jan 15 01:59:27.747931 kernel: pci 0000:00:04.6: bridge window [mem 0x81400000-0x815fffff] Jan 15 01:59:27.748027 kernel: pci 0000:00:04.6: bridge window [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 15 01:59:27.748144 kernel: pci 0000:00:04.7: PCI bridge to [bus 19] Jan 15 01:59:27.748239 kernel: pci 0000:00:04.7: bridge window [io 0x8000-0x8fff] Jan 15 01:59:27.748333 kernel: pci 0000:00:04.7: bridge window [mem 0x81200000-0x813fffff] Jan 15 01:59:27.748427 kernel: pci 0000:00:04.7: bridge window [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 15 01:59:27.748522 kernel: pci 0000:00:05.0: PCI bridge to [bus 1a] Jan 15 01:59:27.748616 kernel: pci 0000:00:05.0: bridge window [io 0x5000-0x5fff] Jan 15 01:59:27.748714 kernel: pci 0000:00:05.0: bridge window [mem 0x81000000-0x811fffff] Jan 15 01:59:27.748809 kernel: pci 0000:00:05.0: bridge window [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 15 01:59:27.748903 kernel: pci 0000:00:05.1: PCI bridge to [bus 1b] Jan 15 01:59:27.748998 kernel: pci 0000:00:05.1: bridge window [io 0x4000-0x4fff] Jan 15 01:59:27.749102 kernel: pci 0000:00:05.1: bridge window [mem 0x80e00000-0x80ffffff] Jan 15 01:59:27.749209 kernel: pci 0000:00:05.1: bridge window [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 15 
01:59:27.749308 kernel: pci 0000:00:05.2: PCI bridge to [bus 1c] Jan 15 01:59:27.749405 kernel: pci 0000:00:05.2: bridge window [io 0x3000-0x3fff] Jan 15 01:59:27.749500 kernel: pci 0000:00:05.2: bridge window [mem 0x80c00000-0x80dfffff] Jan 15 01:59:27.749596 kernel: pci 0000:00:05.2: bridge window [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 15 01:59:27.749693 kernel: pci 0000:00:05.3: PCI bridge to [bus 1d] Jan 15 01:59:27.749788 kernel: pci 0000:00:05.3: bridge window [io 0x2000-0x2fff] Jan 15 01:59:27.749884 kernel: pci 0000:00:05.3: bridge window [mem 0x80a00000-0x80bfffff] Jan 15 01:59:27.749978 kernel: pci 0000:00:05.3: bridge window [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 15 01:59:27.750080 kernel: pci 0000:00:05.4: PCI bridge to [bus 1e] Jan 15 01:59:27.750182 kernel: pci 0000:00:05.4: bridge window [io 0x1000-0x1fff] Jan 15 01:59:27.750278 kernel: pci 0000:00:05.4: bridge window [mem 0x80800000-0x809fffff] Jan 15 01:59:27.750372 kernel: pci 0000:00:05.4: bridge window [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 15 01:59:27.750466 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jan 15 01:59:27.750554 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jan 15 01:59:27.750640 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jan 15 01:59:27.750726 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Jan 15 01:59:27.750811 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Jan 15 01:59:27.750896 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x38e800003fff window] Jan 15 01:59:27.750999 kernel: pci_bus 0000:01: resource 0 [io 0x6000-0x6fff] Jan 15 01:59:27.751102 kernel: pci_bus 0000:01: resource 1 [mem 0x84000000-0x842fffff] Jan 15 01:59:27.751193 kernel: pci_bus 0000:01: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 15 01:59:27.751287 kernel: pci_bus 0000:02: resource 0 [io 0x6000-0x6fff] Jan 15 01:59:27.751380 kernel: pci_bus 0000:02: resource 1 [mem 0x84000000-0x841fffff] Jan 15 01:59:27.751471 kernel: pci_bus 0000:02: resource 2 [mem 0x380000000000-0x3807ffffffff 64bit pref] Jan 15 01:59:27.751570 kernel: pci_bus 0000:03: resource 1 [mem 0x83e00000-0x83ffffff] Jan 15 01:59:27.751661 kernel: pci_bus 0000:03: resource 2 [mem 0x380800000000-0x380fffffffff 64bit pref] Jan 15 01:59:27.751758 kernel: pci_bus 0000:04: resource 1 [mem 0x83c00000-0x83dfffff] Jan 15 01:59:27.751848 kernel: pci_bus 0000:04: resource 2 [mem 0x381000000000-0x3817ffffffff 64bit pref] Jan 15 01:59:27.751945 kernel: pci_bus 0000:05: resource 1 [mem 0x83a00000-0x83bfffff] Jan 15 01:59:27.752039 kernel: pci_bus 0000:05: resource 2 [mem 0x381800000000-0x381fffffffff 64bit pref] Jan 15 01:59:27.752153 kernel: pci_bus 0000:06: resource 1 [mem 0x83800000-0x839fffff] Jan 15 01:59:27.752242 kernel: pci_bus 0000:06: resource 2 [mem 0x382000000000-0x3827ffffffff 64bit pref] Jan 15 01:59:27.752339 kernel: pci_bus 0000:07: resource 1 [mem 0x83600000-0x837fffff] Jan 15 01:59:27.752428 kernel: pci_bus 0000:07: resource 2 [mem 0x382800000000-0x382fffffffff 64bit pref] Jan 15 01:59:27.752525 kernel: pci_bus 0000:08: resource 1 [mem 0x83400000-0x835fffff] Jan 15 01:59:27.752617 kernel: pci_bus 0000:08: resource 2 [mem 0x383000000000-0x3837ffffffff 64bit pref] Jan 15 01:59:27.752710 kernel: pci_bus 0000:09: resource 1 [mem 0x83200000-0x833fffff] Jan 15 01:59:27.752799 kernel: pci_bus 0000:09: resource 2 [mem 0x383800000000-0x383fffffffff 64bit pref] Jan 15 01:59:27.752893 kernel: pci_bus 0000:0a: 
resource 1 [mem 0x83000000-0x831fffff] Jan 15 01:59:27.752983 kernel: pci_bus 0000:0a: resource 2 [mem 0x384000000000-0x3847ffffffff 64bit pref] Jan 15 01:59:27.753092 kernel: pci_bus 0000:0b: resource 1 [mem 0x82e00000-0x82ffffff] Jan 15 01:59:27.753192 kernel: pci_bus 0000:0b: resource 2 [mem 0x384800000000-0x384fffffffff 64bit pref] Jan 15 01:59:27.753287 kernel: pci_bus 0000:0c: resource 1 [mem 0x82c00000-0x82dfffff] Jan 15 01:59:27.753376 kernel: pci_bus 0000:0c: resource 2 [mem 0x385000000000-0x3857ffffffff 64bit pref] Jan 15 01:59:27.753473 kernel: pci_bus 0000:0d: resource 1 [mem 0x82a00000-0x82bfffff] Jan 15 01:59:27.753562 kernel: pci_bus 0000:0d: resource 2 [mem 0x385800000000-0x385fffffffff 64bit pref] Jan 15 01:59:27.753656 kernel: pci_bus 0000:0e: resource 1 [mem 0x82800000-0x829fffff] Jan 15 01:59:27.753745 kernel: pci_bus 0000:0e: resource 2 [mem 0x386000000000-0x3867ffffffff 64bit pref] Jan 15 01:59:27.753841 kernel: pci_bus 0000:0f: resource 1 [mem 0x82600000-0x827fffff] Jan 15 01:59:27.753934 kernel: pci_bus 0000:0f: resource 2 [mem 0x386800000000-0x386fffffffff 64bit pref] Jan 15 01:59:27.754028 kernel: pci_bus 0000:10: resource 1 [mem 0x82400000-0x825fffff] Jan 15 01:59:27.754125 kernel: pci_bus 0000:10: resource 2 [mem 0x387000000000-0x3877ffffffff 64bit pref] Jan 15 01:59:27.754220 kernel: pci_bus 0000:11: resource 1 [mem 0x82200000-0x823fffff] Jan 15 01:59:27.754310 kernel: pci_bus 0000:11: resource 2 [mem 0x387800000000-0x387fffffffff 64bit pref] Jan 15 01:59:27.754404 kernel: pci_bus 0000:12: resource 0 [io 0xf000-0xffff] Jan 15 01:59:27.754497 kernel: pci_bus 0000:12: resource 1 [mem 0x82000000-0x821fffff] Jan 15 01:59:27.754587 kernel: pci_bus 0000:12: resource 2 [mem 0x388000000000-0x3887ffffffff 64bit pref] Jan 15 01:59:27.754686 kernel: pci_bus 0000:13: resource 0 [io 0xe000-0xefff] Jan 15 01:59:27.754778 kernel: pci_bus 0000:13: resource 1 [mem 0x81e00000-0x81ffffff] Jan 15 01:59:27.754869 kernel: pci_bus 0000:13: resource 2 [mem 0x388800000000-0x388fffffffff 64bit pref] Jan 15 01:59:27.754966 kernel: pci_bus 0000:14: resource 0 [io 0xd000-0xdfff] Jan 15 01:59:27.755062 kernel: pci_bus 0000:14: resource 1 [mem 0x81c00000-0x81dfffff] Jan 15 01:59:27.755161 kernel: pci_bus 0000:14: resource 2 [mem 0x389000000000-0x3897ffffffff 64bit pref] Jan 15 01:59:27.755255 kernel: pci_bus 0000:15: resource 0 [io 0xc000-0xcfff] Jan 15 01:59:27.755344 kernel: pci_bus 0000:15: resource 1 [mem 0x81a00000-0x81bfffff] Jan 15 01:59:27.755432 kernel: pci_bus 0000:15: resource 2 [mem 0x389800000000-0x389fffffffff 64bit pref] Jan 15 01:59:27.755529 kernel: pci_bus 0000:16: resource 0 [io 0xb000-0xbfff] Jan 15 01:59:27.755618 kernel: pci_bus 0000:16: resource 1 [mem 0x81800000-0x819fffff] Jan 15 01:59:27.755706 kernel: pci_bus 0000:16: resource 2 [mem 0x38a000000000-0x38a7ffffffff 64bit pref] Jan 15 01:59:27.755800 kernel: pci_bus 0000:17: resource 0 [io 0xa000-0xafff] Jan 15 01:59:27.755890 kernel: pci_bus 0000:17: resource 1 [mem 0x81600000-0x817fffff] Jan 15 01:59:27.755980 kernel: pci_bus 0000:17: resource 2 [mem 0x38a800000000-0x38afffffffff 64bit pref] Jan 15 01:59:27.756087 kernel: pci_bus 0000:18: resource 0 [io 0x9000-0x9fff] Jan 15 01:59:27.756178 kernel: pci_bus 0000:18: resource 1 [mem 0x81400000-0x815fffff] Jan 15 01:59:27.756266 kernel: pci_bus 0000:18: resource 2 [mem 0x38b000000000-0x38b7ffffffff 64bit pref] Jan 15 01:59:27.756359 kernel: pci_bus 0000:19: resource 0 [io 0x8000-0x8fff] Jan 15 01:59:27.756448 kernel: pci_bus 0000:19: resource 1 [mem 
0x81200000-0x813fffff] Jan 15 01:59:27.756538 kernel: pci_bus 0000:19: resource 2 [mem 0x38b800000000-0x38bfffffffff 64bit pref] Jan 15 01:59:27.756631 kernel: pci_bus 0000:1a: resource 0 [io 0x5000-0x5fff] Jan 15 01:59:27.756720 kernel: pci_bus 0000:1a: resource 1 [mem 0x81000000-0x811fffff] Jan 15 01:59:27.756808 kernel: pci_bus 0000:1a: resource 2 [mem 0x38c000000000-0x38c7ffffffff 64bit pref] Jan 15 01:59:27.756902 kernel: pci_bus 0000:1b: resource 0 [io 0x4000-0x4fff] Jan 15 01:59:27.756991 kernel: pci_bus 0000:1b: resource 1 [mem 0x80e00000-0x80ffffff] Jan 15 01:59:27.757098 kernel: pci_bus 0000:1b: resource 2 [mem 0x38c800000000-0x38cfffffffff 64bit pref] Jan 15 01:59:27.757232 kernel: pci_bus 0000:1c: resource 0 [io 0x3000-0x3fff] Jan 15 01:59:27.757324 kernel: pci_bus 0000:1c: resource 1 [mem 0x80c00000-0x80dfffff] Jan 15 01:59:27.757413 kernel: pci_bus 0000:1c: resource 2 [mem 0x38d000000000-0x38d7ffffffff 64bit pref] Jan 15 01:59:27.757507 kernel: pci_bus 0000:1d: resource 0 [io 0x2000-0x2fff] Jan 15 01:59:27.757600 kernel: pci_bus 0000:1d: resource 1 [mem 0x80a00000-0x80bfffff] Jan 15 01:59:27.757689 kernel: pci_bus 0000:1d: resource 2 [mem 0x38d800000000-0x38dfffffffff 64bit pref] Jan 15 01:59:27.757784 kernel: pci_bus 0000:1e: resource 0 [io 0x1000-0x1fff] Jan 15 01:59:27.757873 kernel: pci_bus 0000:1e: resource 1 [mem 0x80800000-0x809fffff] Jan 15 01:59:27.757962 kernel: pci_bus 0000:1e: resource 2 [mem 0x38e000000000-0x38e7ffffffff 64bit pref] Jan 15 01:59:27.757973 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jan 15 01:59:27.757984 kernel: PCI: CLS 0 bytes, default 64 Jan 15 01:59:27.757993 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jan 15 01:59:27.758001 kernel: software IO TLB: mapped [mem 0x0000000077ede000-0x000000007bede000] (64MB) Jan 15 01:59:27.758010 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 15 01:59:27.758018 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x21134f58f0d, max_idle_ns: 440795217993 ns Jan 15 01:59:27.758027 kernel: Initialise system trusted keyrings Jan 15 01:59:27.758036 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 15 01:59:27.758046 kernel: Key type asymmetric registered Jan 15 01:59:27.758054 kernel: Asymmetric key parser 'x509' registered Jan 15 01:59:27.758073 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jan 15 01:59:27.758081 kernel: io scheduler mq-deadline registered Jan 15 01:59:27.758089 kernel: io scheduler kyber registered Jan 15 01:59:27.758098 kernel: io scheduler bfq registered Jan 15 01:59:27.758197 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jan 15 01:59:27.758298 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jan 15 01:59:27.758394 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jan 15 01:59:27.758492 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jan 15 01:59:27.758587 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jan 15 01:59:27.758684 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jan 15 01:59:27.758782 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jan 15 01:59:27.758878 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jan 15 01:59:27.758973 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jan 15 01:59:27.759084 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jan 15 01:59:27.759181 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jan 15 01:59:27.759279 kernel: pcieport 
0000:00:02.5: AER: enabled with IRQ 29 Jan 15 01:59:27.759374 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Jan 15 01:59:27.759469 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jan 15 01:59:27.759563 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jan 15 01:59:27.759659 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jan 15 01:59:27.759672 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jan 15 01:59:27.759766 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jan 15 01:59:27.759862 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jan 15 01:59:27.759957 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33 Jan 15 01:59:27.760052 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33 Jan 15 01:59:27.760162 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34 Jan 15 01:59:27.760258 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34 Jan 15 01:59:27.760353 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35 Jan 15 01:59:27.760448 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35 Jan 15 01:59:27.760542 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36 Jan 15 01:59:27.760639 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36 Jan 15 01:59:27.760734 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37 Jan 15 01:59:27.760829 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37 Jan 15 01:59:27.760923 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38 Jan 15 01:59:27.761017 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38 Jan 15 01:59:27.761129 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39 Jan 15 01:59:27.761233 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 39 Jan 15 01:59:27.761244 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Jan 15 01:59:27.761338 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40 Jan 15 01:59:27.761432 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40 Jan 15 01:59:27.761527 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 41 Jan 15 01:59:27.761621 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 41 Jan 15 01:59:27.761718 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 42 Jan 15 01:59:27.761812 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 42 Jan 15 01:59:27.761906 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 43 Jan 15 01:59:27.762000 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 43 Jan 15 01:59:27.762104 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 44 Jan 15 01:59:27.762198 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 44 Jan 15 01:59:27.762296 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 45 Jan 15 01:59:27.762390 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 45 Jan 15 01:59:27.762485 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 46 Jan 15 01:59:27.762583 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 46 Jan 15 01:59:27.762679 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 47 Jan 15 01:59:27.762773 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 47 Jan 15 01:59:27.762784 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jan 15 01:59:27.762878 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 48 Jan 15 01:59:27.762972 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 48 Jan 15 01:59:27.763081 kernel: pcieport 0000:00:05.1: PME: Signaling with IRQ 49 Jan 15 01:59:27.763177 kernel: pcieport 0000:00:05.1: AER: enabled with IRQ 49 Jan 15 01:59:27.763271 kernel: pcieport 0000:00:05.2: PME: Signaling with IRQ 50 Jan 15 01:59:27.763365 kernel: 
pcieport 0000:00:05.2: AER: enabled with IRQ 50 Jan 15 01:59:27.763458 kernel: pcieport 0000:00:05.3: PME: Signaling with IRQ 51 Jan 15 01:59:27.763555 kernel: pcieport 0000:00:05.3: AER: enabled with IRQ 51 Jan 15 01:59:27.763649 kernel: pcieport 0000:00:05.4: PME: Signaling with IRQ 52 Jan 15 01:59:27.763743 kernel: pcieport 0000:00:05.4: AER: enabled with IRQ 52 Jan 15 01:59:27.763754 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jan 15 01:59:27.763763 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 15 01:59:27.763771 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 15 01:59:27.763780 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jan 15 01:59:27.763790 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 15 01:59:27.763799 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 15 01:59:27.763807 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 15 01:59:27.763905 kernel: rtc_cmos 00:03: RTC can wake from S4 Jan 15 01:59:27.763997 kernel: rtc_cmos 00:03: registered as rtc0 Jan 15 01:59:27.764097 kernel: rtc_cmos 00:03: setting system clock to 2026-01-15T01:59:25 UTC (1768442365) Jan 15 01:59:27.764191 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Jan 15 01:59:27.764202 kernel: intel_pstate: CPU model not supported Jan 15 01:59:27.764210 kernel: efifb: probing for efifb Jan 15 01:59:27.764219 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Jan 15 01:59:27.764229 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Jan 15 01:59:27.764237 kernel: efifb: scrolling: redraw Jan 15 01:59:27.764245 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Jan 15 01:59:27.764255 kernel: Console: switching to colour frame buffer device 160x50 Jan 15 01:59:27.764263 kernel: fb0: EFI VGA frame buffer device Jan 15 01:59:27.764272 kernel: pstore: Using crash dump compression: deflate Jan 15 01:59:27.764280 kernel: pstore: Registered efi_pstore as persistent store backend Jan 15 01:59:27.764288 kernel: NET: Registered PF_INET6 protocol family Jan 15 01:59:27.764297 kernel: Segment Routing with IPv6 Jan 15 01:59:27.764305 kernel: In-situ OAM (IOAM) with IPv6 Jan 15 01:59:27.764315 kernel: NET: Registered PF_PACKET protocol family Jan 15 01:59:27.764323 kernel: Key type dns_resolver registered Jan 15 01:59:27.764331 kernel: IPI shorthand broadcast: enabled Jan 15 01:59:27.764339 kernel: sched_clock: Marking stable (2693002334, 153521442)->(2951322817, -104799041) Jan 15 01:59:27.764348 kernel: registered taskstats version 1 Jan 15 01:59:27.764356 kernel: Loading compiled-in X.509 certificates Jan 15 01:59:27.764364 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.65-flatcar: e8b6753a1cbf8103f5806ce5d59781743c62fae9' Jan 15 01:59:27.764374 kernel: Demotion targets for Node 0: null Jan 15 01:59:27.764383 kernel: Key type .fscrypt registered Jan 15 01:59:27.764391 kernel: Key type fscrypt-provisioning registered Jan 15 01:59:27.764399 kernel: ima: No TPM chip found, activating TPM-bypass! 
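Two figures in the rtc_cmos and efifb messages above can be cross-checked by simple arithmetic. A minimal Python sketch (not part of the log; all constants are copied from those messages) confirming that epoch 1768442365 is 2026-01-15T01:59:25 UTC and that a 1280x800x32 mode yields linelength=5120 and a 4000k framebuffer:

    from datetime import datetime, timezone

    # rtc_cmos: "setting system clock to 2026-01-15T01:59:25 UTC (1768442365)"
    assert datetime.fromtimestamp(1768442365, tz=timezone.utc) == \
        datetime(2026, 1, 15, 1, 59, 25, tzinfo=timezone.utc)

    # efifb: "mode is 1280x800x32, linelength=5120" and "using 4000k"
    width, height, bpp = 1280, 800, 32
    linelength = width * bpp // 8               # bytes per scanline
    assert linelength == 5120
    assert linelength * height // 1024 == 4000  # framebuffer size in KiB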
Jan 15 01:59:27.764407 kernel: ima: Allocated hash algorithm: sha1 Jan 15 01:59:27.764415 kernel: ima: No architecture policies found Jan 15 01:59:27.764424 kernel: clk: Disabling unused clocks Jan 15 01:59:27.764432 kernel: Freeing unused kernel image (initmem) memory: 15432K Jan 15 01:59:27.764442 kernel: Write protecting the kernel read-only data: 45056k Jan 15 01:59:27.764450 kernel: Freeing unused kernel image (rodata/data gap) memory: 824K Jan 15 01:59:27.764459 kernel: Run /init as init process Jan 15 01:59:27.764467 kernel: with arguments: Jan 15 01:59:27.764475 kernel: /init Jan 15 01:59:27.764483 kernel: with environment: Jan 15 01:59:27.764491 kernel: HOME=/ Jan 15 01:59:27.764501 kernel: TERM=linux Jan 15 01:59:27.764509 kernel: SCSI subsystem initialized Jan 15 01:59:27.764517 kernel: libata version 3.00 loaded. Jan 15 01:59:27.764615 kernel: ahci 0000:00:1f.2: version 3.0 Jan 15 01:59:27.764626 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jan 15 01:59:27.764720 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jan 15 01:59:27.764815 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jan 15 01:59:27.764913 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jan 15 01:59:27.765024 kernel: scsi host0: ahci Jan 15 01:59:27.765137 kernel: scsi host1: ahci Jan 15 01:59:27.765266 kernel: scsi host2: ahci Jan 15 01:59:27.765367 kernel: scsi host3: ahci Jan 15 01:59:27.765473 kernel: scsi host4: ahci Jan 15 01:59:27.765574 kernel: scsi host5: ahci Jan 15 01:59:27.765585 kernel: ata1: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380100 irq 55 lpm-pol 1 Jan 15 01:59:27.765594 kernel: ata2: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380180 irq 55 lpm-pol 1 Jan 15 01:59:27.765602 kernel: ata3: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380200 irq 55 lpm-pol 1 Jan 15 01:59:27.765611 kernel: ata4: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380280 irq 55 lpm-pol 1 Jan 15 01:59:27.765622 kernel: ata5: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380300 irq 55 lpm-pol 1 Jan 15 01:59:27.765630 kernel: ata6: SATA max UDMA/133 abar m4096@0x84380000 port 0x84380380 irq 55 lpm-pol 1 Jan 15 01:59:27.765639 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 15 01:59:27.765648 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 15 01:59:27.765656 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 15 01:59:27.765664 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 15 01:59:27.765673 kernel: ata1: SATA link down (SStatus 0 SControl 300) Jan 15 01:59:27.765683 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 15 01:59:27.765691 kernel: ACPI: bus type USB registered Jan 15 01:59:27.765700 kernel: usbcore: registered new interface driver usbfs Jan 15 01:59:27.765708 kernel: usbcore: registered new interface driver hub Jan 15 01:59:27.765717 kernel: usbcore: registered new device driver usb Jan 15 01:59:27.765822 kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller Jan 15 01:59:27.765924 kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1 Jan 15 01:59:27.766028 kernel: uhci_hcd 0000:02:01.0: detected 2 ports Jan 15 01:59:27.766146 kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x00006000 Jan 15 01:59:27.766277 kernel: hub 1-0:1.0: USB hub found Jan 15 01:59:27.766388 kernel: hub 1-0:1.0: 2 ports detected Jan 15 01:59:27.766501 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues Jan 15 01:59:27.766599 kernel: virtio_blk virtio2: [vda] 104857600 512-byte logical blocks 
(53.7 GB/50.0 GiB) Jan 15 01:59:27.766612 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 15 01:59:27.766622 kernel: GPT:25804799 != 104857599 Jan 15 01:59:27.766630 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 15 01:59:27.766639 kernel: GPT:25804799 != 104857599 Jan 15 01:59:27.766648 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 15 01:59:27.766656 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 15 01:59:27.766667 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 15 01:59:27.766676 kernel: device-mapper: uevent: version 1.0.3 Jan 15 01:59:27.766685 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 15 01:59:27.766693 kernel: device-mapper: verity: sha256 using shash "sha256-generic" Jan 15 01:59:27.766702 kernel: raid6: avx512x4 gen() 21146 MB/s Jan 15 01:59:27.766711 kernel: raid6: avx512x2 gen() 29600 MB/s Jan 15 01:59:27.766719 kernel: raid6: avx512x1 gen() 32402 MB/s Jan 15 01:59:27.766730 kernel: raid6: avx2x4 gen() 26660 MB/s Jan 15 01:59:27.766738 kernel: raid6: avx2x2 gen() 27868 MB/s Jan 15 01:59:27.766746 kernel: raid6: avx2x1 gen() 26208 MB/s Jan 15 01:59:27.766755 kernel: raid6: using algorithm avx512x1 gen() 32402 MB/s Jan 15 01:59:27.766764 kernel: raid6: .... xor() 24688 MB/s, rmw enabled Jan 15 01:59:27.766774 kernel: raid6: using avx512x2 recovery algorithm Jan 15 01:59:27.766783 kernel: xor: automatically using best checksumming function avx Jan 15 01:59:27.766906 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd Jan 15 01:59:27.766919 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 15 01:59:27.766928 kernel: BTRFS: device fsid 1fc5e5ba-2a81-4f9e-b722-a47a3e33c106 devid 1 transid 34 /dev/mapper/usr (253:0) scanned by mount (205) Jan 15 01:59:27.766937 kernel: BTRFS info (device dm-0): first mount of filesystem 1fc5e5ba-2a81-4f9e-b722-a47a3e33c106 Jan 15 01:59:27.766945 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 15 01:59:27.766954 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 15 01:59:27.766965 kernel: BTRFS info (device dm-0): enabling free space tree Jan 15 01:59:27.766973 kernel: loop: module loaded Jan 15 01:59:27.766982 kernel: loop0: detected capacity change from 0 to 100160 Jan 15 01:59:27.766991 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 15 01:59:27.766999 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 15 01:59:27.767008 kernel: usbcore: registered new interface driver usbhid Jan 15 01:59:27.767017 kernel: usbhid: USB HID core driver Jan 15 01:59:27.767028 systemd[1]: Successfully made /usr/ read-only. Jan 15 01:59:27.767040 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 15 01:59:27.767050 systemd[1]: Detected virtualization kvm. Jan 15 01:59:27.767067 systemd[1]: Detected architecture x86-64. Jan 15 01:59:27.767076 systemd[1]: Running in initrd. Jan 15 01:59:27.767084 systemd[1]: No hostname configured, using default hostname. Jan 15 01:59:27.767096 systemd[1]: Hostname set to . 
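The GPT warning printed for vda earlier in this stretch of the log is arithmetic on the two LBAs it shows. A short read-only sketch (assuming only the 512-byte sector size the virtio_blk line reports) of where the numbers come from:

    SECTOR = 512
    total_sectors = 104857600        # virtio_blk: "[vda] 104857600 512-byte logical blocks"
    alt_header_lba = 25804799        # GPT: where the backup header currently sits

    size_bytes = total_sectors * SECTOR
    print(f"{size_bytes / 1e9:.1f} GB / {size_bytes / 2**30:.1f} GiB")  # 53.7 GB / 50.0 GiB

    expected_alt_lba = total_sectors - 1   # the backup header belongs on the last LBA
    print(expected_alt_lba)                # 104857599, hence "25804799 != 104857599"

    # The backup header is positioned for a disk of alt_header_lba + 1 sectors,
    # consistent with an image built at roughly 12.3 GiB and the virtual disk grown later.
    print((alt_header_lba + 1) * SECTOR / 2**30)

Correcting the table, whether with GNU Parted as the kernel suggests or by rewriting it as disk-uuid does further down, amounts to relocating that backup header to the last LBA.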
Jan 15 01:59:27.767104 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 15 01:59:27.767113 systemd[1]: Queued start job for default target initrd.target. Jan 15 01:59:27.767122 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 15 01:59:27.767131 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 15 01:59:27.767140 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 15 01:59:27.767151 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 15 01:59:27.767161 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 15 01:59:27.767170 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 15 01:59:27.767179 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 15 01:59:27.767188 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 15 01:59:27.767197 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 15 01:59:27.767208 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 15 01:59:27.767217 systemd[1]: Reached target paths.target - Path Units. Jan 15 01:59:27.767226 systemd[1]: Reached target slices.target - Slice Units. Jan 15 01:59:27.767235 systemd[1]: Reached target swap.target - Swaps. Jan 15 01:59:27.767244 systemd[1]: Reached target timers.target - Timer Units. Jan 15 01:59:27.767253 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 15 01:59:27.767262 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 15 01:59:27.767273 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 15 01:59:27.767283 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 15 01:59:27.767292 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 15 01:59:27.767301 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 15 01:59:27.767310 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 15 01:59:27.767319 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 15 01:59:27.767328 systemd[1]: Reached target sockets.target - Socket Units. Jan 15 01:59:27.767339 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 15 01:59:27.767348 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 15 01:59:27.767357 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 15 01:59:27.767366 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 15 01:59:27.767375 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 15 01:59:27.767384 systemd[1]: Starting systemd-fsck-usr.service... Jan 15 01:59:27.767395 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 15 01:59:27.767404 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 15 01:59:27.767413 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
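The device units being expected above (dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device and friends) show systemd's path-to-unit-name escaping: slashes become dashes and a literal dash becomes \x2d. A simplified sketch that reproduces the mapping for exactly these paths; systemd-escape(1) implements the full rule:

    def escape_path(path: str) -> str:
        out = []
        for ch in path.lstrip("/"):
            if ch == "/":
                out.append("-")                    # path separators turn into dashes
            elif ch.isalnum() or ch in "_.":
                out.append(ch)
            else:
                out.append(f"\\x{ord(ch):02x}")    # anything else, including '-', becomes \xXX
        return "".join(out)

    assert escape_path("/dev/disk/by-label/EFI-SYSTEM") == r"dev-disk-by\x2dlabel-EFI\x2dSYSTEM"
    assert escape_path("/dev/disk/by-partlabel/USR-A") == r"dev-disk-by\x2dpartlabel-USR\x2dA"
    # systemd appends the .device suffix to form the unit names seen in the log.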
Jan 15 01:59:27.767441 systemd-journald[342]: Collecting audit messages is enabled. Jan 15 01:59:27.767464 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 15 01:59:27.767474 kernel: audit: type=1130 audit(1768442367.683:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.767483 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 15 01:59:27.767493 systemd[1]: Finished systemd-fsck-usr.service. Jan 15 01:59:27.767503 kernel: audit: type=1130 audit(1768442367.690:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.767512 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 15 01:59:27.767521 kernel: audit: type=1130 audit(1768442367.697:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.767531 systemd-journald[342]: Journal started Jan 15 01:59:27.767553 systemd-journald[342]: Runtime Journal (/run/log/journal/daf2ebc79e064bccabee4c74708a9509) is 8M, max 77.9M, 69.9M free. Jan 15 01:59:27.769777 systemd[1]: Started systemd-journald.service - Journal Service. Jan 15 01:59:27.769806 kernel: audit: type=1130 audit(1768442367.769:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.683000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.690000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.697000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.769000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.769762 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 15 01:59:27.774000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.785187 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 15 01:59:27.788641 kernel: audit: type=1130 audit(1768442367.774:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.793412 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Jan 15 01:59:27.796353 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 15 01:59:27.800440 systemd-modules-load[345]: Inserted module 'br_netfilter' Jan 15 01:59:27.801789 kernel: Bridge firewalling registered Jan 15 01:59:27.801815 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:59:27.809418 kernel: audit: type=1130 audit(1768442367.803:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.803000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.803826 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 15 01:59:27.807598 systemd-tmpfiles[361]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 15 01:59:27.818493 kernel: audit: type=1130 audit(1768442367.811:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.811000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.814133 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 15 01:59:27.821320 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 15 01:59:27.832665 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 15 01:59:27.837576 kernel: audit: type=1130 audit(1768442367.833:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.833000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.839144 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 15 01:59:27.839000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.846079 kernel: audit: type=1130 audit(1768442367.839:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.846393 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 15 01:59:27.846000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.847000 audit: BPF prog-id=6 op=LOAD Jan 15 01:59:27.850212 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 15 01:59:27.859504 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Jan 15 01:59:27.860000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.861813 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 15 01:59:27.890757 dracut-cmdline[386]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=1042e64ca7212ba2a277cb872bdf1dc4e195c9fb8110078c443b3efbd2488cb9 Jan 15 01:59:27.900595 systemd-resolved[378]: Positive Trust Anchors: Jan 15 01:59:27.900605 systemd-resolved[378]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 15 01:59:27.900609 systemd-resolved[378]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 15 01:59:27.900640 systemd-resolved[378]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 15 01:59:27.928916 systemd-resolved[378]: Defaulting to hostname 'linux'. Jan 15 01:59:27.930225 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 15 01:59:27.935484 kernel: kauditd_printk_skb: 3 callbacks suppressed Jan 15 01:59:27.935507 kernel: audit: type=1130 audit(1768442367.930:14): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.930000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:27.930831 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 15 01:59:27.985084 kernel: Loading iSCSI transport class v2.0-870. Jan 15 01:59:28.004113 kernel: iscsi: registered transport (tcp) Jan 15 01:59:28.029254 kernel: iscsi: registered transport (qla4xxx) Jan 15 01:59:28.029292 kernel: QLogic iSCSI HBA Driver Jan 15 01:59:28.066351 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 15 01:59:28.086746 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 15 01:59:28.088000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.093477 systemd[1]: Reached target network-pre.target - Preparation for Network. 
Jan 15 01:59:28.098109 kernel: audit: type=1130 audit(1768442368.088:15): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.167404 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 15 01:59:28.168000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.175137 kernel: audit: type=1130 audit(1768442368.168:16): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.175256 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 15 01:59:28.178269 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 15 01:59:28.223186 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 15 01:59:28.223000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.227203 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 15 01:59:28.234086 kernel: audit: type=1130 audit(1768442368.223:17): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.234112 kernel: audit: type=1334 audit(1768442368.225:18): prog-id=7 op=LOAD Jan 15 01:59:28.234131 kernel: audit: type=1334 audit(1768442368.225:19): prog-id=8 op=LOAD Jan 15 01:59:28.225000 audit: BPF prog-id=7 op=LOAD Jan 15 01:59:28.225000 audit: BPF prog-id=8 op=LOAD Jan 15 01:59:28.260332 systemd-udevd[625]: Using default interface naming scheme 'v257'. Jan 15 01:59:28.269978 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 15 01:59:28.270000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.278070 kernel: audit: type=1130 audit(1768442368.270:20): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.276097 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 15 01:59:28.300605 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 15 01:59:28.305310 kernel: audit: type=1130 audit(1768442368.301:21): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.301000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:59:28.305367 dracut-pre-trigger[688]: rd.md=0: removing MD RAID activation Jan 15 01:59:28.310947 kernel: audit: type=1334 audit(1768442368.302:22): prog-id=9 op=LOAD Jan 15 01:59:28.302000 audit: BPF prog-id=9 op=LOAD Jan 15 01:59:28.304170 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 15 01:59:28.337008 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 15 01:59:28.337000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.338695 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 15 01:59:28.342890 kernel: audit: type=1130 audit(1768442368.337:23): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.355855 systemd-networkd[734]: lo: Link UP Jan 15 01:59:28.355860 systemd-networkd[734]: lo: Gained carrier Jan 15 01:59:28.357000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.356837 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 15 01:59:28.357519 systemd[1]: Reached target network.target - Network. Jan 15 01:59:28.426801 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 15 01:59:28.427000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.429113 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 15 01:59:28.519815 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 15 01:59:28.537742 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 15 01:59:28.545114 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 15 01:59:28.555169 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 15 01:59:28.561072 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Jan 15 01:59:28.564076 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0 Jan 15 01:59:28.576036 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 15 01:59:28.577977 disk-uuid[795]: Primary Header is updated. Jan 15 01:59:28.577977 disk-uuid[795]: Secondary Entries is updated. Jan 15 01:59:28.577977 disk-uuid[795]: Secondary Header is updated. Jan 15 01:59:28.624076 kernel: cryptd: max_cpu_qlen set to 1000 Jan 15 01:59:28.625590 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 01:59:28.625692 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:59:28.628000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:59:28.628584 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 01:59:28.635981 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 01:59:28.654011 systemd-networkd[734]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 15 01:59:28.654992 systemd-networkd[734]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 15 01:59:28.656128 systemd-networkd[734]: eth0: Link UP Jan 15 01:59:28.656276 systemd-networkd[734]: eth0: Gained carrier Jan 15 01:59:28.656287 systemd-networkd[734]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 15 01:59:28.667181 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 01:59:28.667774 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:59:28.668000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.668000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.671707 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 01:59:28.673748 kernel: AES CTR mode by8 optimization enabled Jan 15 01:59:28.695135 systemd-networkd[734]: eth0: DHCPv4 address 10.0.1.164/25, gateway 10.0.1.129 acquired from 10.0.1.129 Jan 15 01:59:28.707397 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Jan 15 01:59:28.736964 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:59:28.738000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.769117 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 15 01:59:28.770000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:28.770941 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 15 01:59:28.771975 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 15 01:59:28.772855 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 15 01:59:28.774537 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 15 01:59:28.795164 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 15 01:59:28.795000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:29.666940 disk-uuid[796]: Warning: The kernel is still using the old partition table. Jan 15 01:59:29.666940 disk-uuid[796]: The new table will be used at the next reboot or after you Jan 15 01:59:29.666940 disk-uuid[796]: run partprobe(8) or kpartx(8) Jan 15 01:59:29.666940 disk-uuid[796]: The operation has completed successfully. 
Jan 15 01:59:29.678807 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 15 01:59:29.679143 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 15 01:59:29.683000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:29.683000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:29.687179 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 15 01:59:29.694290 systemd-networkd[734]: eth0: Gained IPv6LL Jan 15 01:59:29.776139 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (920) Jan 15 01:59:29.787121 kernel: BTRFS info (device vda6): first mount of filesystem 372d586b-dfcb-4c9b-8d15-cc0618567790 Jan 15 01:59:29.787203 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 15 01:59:29.802644 kernel: BTRFS info (device vda6): turning on async discard Jan 15 01:59:29.802720 kernel: BTRFS info (device vda6): enabling free space tree Jan 15 01:59:29.821175 kernel: BTRFS info (device vda6): last unmount of filesystem 372d586b-dfcb-4c9b-8d15-cc0618567790 Jan 15 01:59:29.822409 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 15 01:59:29.823000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:29.827409 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 15 01:59:30.216290 ignition[939]: Ignition 2.22.0 Jan 15 01:59:30.217372 ignition[939]: Stage: fetch-offline Jan 15 01:59:30.218106 ignition[939]: no configs at "/usr/lib/ignition/base.d" Jan 15 01:59:30.218731 ignition[939]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 01:59:30.219428 ignition[939]: parsed url from cmdline: "" Jan 15 01:59:30.219433 ignition[939]: no config URL provided Jan 15 01:59:30.219441 ignition[939]: reading system config file "/usr/lib/ignition/user.ign" Jan 15 01:59:30.219454 ignition[939]: no config at "/usr/lib/ignition/user.ign" Jan 15 01:59:30.219461 ignition[939]: failed to fetch config: resource requires networking Jan 15 01:59:30.223000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:30.222511 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 15 01:59:30.220382 ignition[939]: Ignition finished successfully Jan 15 01:59:30.226205 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Jan 15 01:59:30.289809 ignition[947]: Ignition 2.22.0 Jan 15 01:59:30.291126 ignition[947]: Stage: fetch Jan 15 01:59:30.291468 ignition[947]: no configs at "/usr/lib/ignition/base.d" Jan 15 01:59:30.291494 ignition[947]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 01:59:30.291676 ignition[947]: parsed url from cmdline: "" Jan 15 01:59:30.291685 ignition[947]: no config URL provided Jan 15 01:59:30.291696 ignition[947]: reading system config file "/usr/lib/ignition/user.ign" Jan 15 01:59:30.291710 ignition[947]: no config at "/usr/lib/ignition/user.ign" Jan 15 01:59:30.291885 ignition[947]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 15 01:59:30.291937 ignition[947]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 15 01:59:30.291949 ignition[947]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 15 01:59:30.814696 ignition[947]: GET result: OK Jan 15 01:59:30.815414 ignition[947]: parsing config with SHA512: 55918d1258646fa4a5abb21bc8479d74e6077b66cb02e1a06e050e38710bf64c5054ccc37c9eec8f358e56cc695e33a504675d1fb7d164618f0ba377604567b5 Jan 15 01:59:30.832400 unknown[947]: fetched base config from "system" Jan 15 01:59:30.832424 unknown[947]: fetched base config from "system" Jan 15 01:59:30.832438 unknown[947]: fetched user config from "openstack" Jan 15 01:59:30.835430 ignition[947]: fetch: fetch complete Jan 15 01:59:30.835444 ignition[947]: fetch: fetch passed Jan 15 01:59:30.835568 ignition[947]: Ignition finished successfully Jan 15 01:59:30.843000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:30.841382 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 15 01:59:30.847138 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 15 01:59:30.914402 ignition[954]: Ignition 2.22.0 Jan 15 01:59:30.914422 ignition[954]: Stage: kargs Jan 15 01:59:30.914683 ignition[954]: no configs at "/usr/lib/ignition/base.d" Jan 15 01:59:30.914701 ignition[954]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 01:59:30.916616 ignition[954]: kargs: kargs passed Jan 15 01:59:30.916686 ignition[954]: Ignition finished successfully Jan 15 01:59:30.920369 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 15 01:59:30.922000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:30.925290 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 15 01:59:30.976568 ignition[961]: Ignition 2.22.0 Jan 15 01:59:30.977175 ignition[961]: Stage: disks Jan 15 01:59:30.977598 ignition[961]: no configs at "/usr/lib/ignition/base.d" Jan 15 01:59:30.977624 ignition[961]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 01:59:30.980160 ignition[961]: disks: disks passed Jan 15 01:59:30.980302 ignition[961]: Ignition finished successfully Jan 15 01:59:30.983847 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 15 01:59:30.984000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:59:30.985565 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 15 01:59:30.986759 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 15 01:59:30.988380 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 15 01:59:30.990012 systemd[1]: Reached target sysinit.target - System Initialization. Jan 15 01:59:30.991764 systemd[1]: Reached target basic.target - Basic System. Jan 15 01:59:30.995613 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 15 01:59:31.085301 systemd-fsck[970]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 15 01:59:31.092177 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 15 01:59:31.093000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:31.097960 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 15 01:59:31.376100 kernel: EXT4-fs (vda9): mounted filesystem 6f459a58-5046-4124-bfbc-09321f1e67d8 r/w with ordered data mode. Quota mode: none. Jan 15 01:59:31.377928 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 15 01:59:31.379726 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 15 01:59:31.385847 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 15 01:59:31.389288 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 15 01:59:31.392659 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 15 01:59:31.404050 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 15 01:59:31.407308 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 15 01:59:31.410287 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 15 01:59:31.427870 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 15 01:59:31.433363 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 15 01:59:31.449110 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (978) Jan 15 01:59:31.454083 kernel: BTRFS info (device vda6): first mount of filesystem 372d586b-dfcb-4c9b-8d15-cc0618567790 Jan 15 01:59:31.458082 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 15 01:59:31.483265 kernel: BTRFS info (device vda6): turning on async discard Jan 15 01:59:31.483340 kernel: BTRFS info (device vda6): enabling free space tree Jan 15 01:59:31.495507 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 15 01:59:31.559129 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:59:31.578685 initrd-setup-root[1006]: cut: /sysroot/etc/passwd: No such file or directory Jan 15 01:59:31.589205 initrd-setup-root[1013]: cut: /sysroot/etc/group: No such file or directory Jan 15 01:59:31.598082 initrd-setup-root[1020]: cut: /sysroot/etc/shadow: No such file or directory Jan 15 01:59:31.604641 initrd-setup-root[1027]: cut: /sysroot/etc/gshadow: No such file or directory Jan 15 01:59:31.779471 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. 
Jan 15 01:59:31.781000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:31.783795 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 15 01:59:31.787248 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 15 01:59:31.820552 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 15 01:59:31.826799 kernel: BTRFS info (device vda6): last unmount of filesystem 372d586b-dfcb-4c9b-8d15-cc0618567790 Jan 15 01:59:31.866584 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 15 01:59:31.867000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:31.883079 ignition[1095]: INFO : Ignition 2.22.0 Jan 15 01:59:31.883079 ignition[1095]: INFO : Stage: mount Jan 15 01:59:31.883079 ignition[1095]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 01:59:31.883079 ignition[1095]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 01:59:31.886966 ignition[1095]: INFO : mount: mount passed Jan 15 01:59:31.887739 ignition[1095]: INFO : Ignition finished successfully Jan 15 01:59:31.889361 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 15 01:59:31.890000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:32.628120 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:59:34.642119 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:59:38.658135 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:59:38.670214 coreos-metadata[980]: Jan 15 01:59:38.669 WARN failed to locate config-drive, using the metadata service API instead Jan 15 01:59:38.718266 coreos-metadata[980]: Jan 15 01:59:38.718 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 15 01:59:38.871608 coreos-metadata[980]: Jan 15 01:59:38.871 INFO Fetch successful Jan 15 01:59:38.873160 coreos-metadata[980]: Jan 15 01:59:38.873 INFO wrote hostname ci-4515-1-0-n-e5e35ee394 to /sysroot/etc/hostname Jan 15 01:59:38.875772 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 15 01:59:38.906370 kernel: kauditd_printk_skb: 19 callbacks suppressed Jan 15 01:59:38.906437 kernel: audit: type=1130 audit(1768442378.876:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:38.906480 kernel: audit: type=1131 audit(1768442378.876:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:38.876000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:38.876000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:59:38.876030 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 15 01:59:38.882108 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 15 01:59:38.929361 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 15 01:59:38.990129 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1111) Jan 15 01:59:38.998105 kernel: BTRFS info (device vda6): first mount of filesystem 372d586b-dfcb-4c9b-8d15-cc0618567790 Jan 15 01:59:39.004142 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 15 01:59:39.017944 kernel: BTRFS info (device vda6): turning on async discard Jan 15 01:59:39.018015 kernel: BTRFS info (device vda6): enabling free space tree Jan 15 01:59:39.023361 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 15 01:59:39.095702 ignition[1129]: INFO : Ignition 2.22.0 Jan 15 01:59:39.095702 ignition[1129]: INFO : Stage: files Jan 15 01:59:39.104957 ignition[1129]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 01:59:39.104957 ignition[1129]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 01:59:39.104957 ignition[1129]: DEBUG : files: compiled without relabeling support, skipping Jan 15 01:59:39.104957 ignition[1129]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 15 01:59:39.104957 ignition[1129]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 15 01:59:39.163505 ignition[1129]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 15 01:59:39.164988 ignition[1129]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 15 01:59:39.166892 ignition[1129]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 15 01:59:39.165831 unknown[1129]: wrote ssh authorized keys file for user: core Jan 15 01:59:39.176620 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 15 01:59:39.179025 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Jan 15 01:59:39.259791 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 15 01:59:39.375755 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jan 15 01:59:39.377048 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 15 01:59:39.377048 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 15 01:59:39.377048 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 15 01:59:39.377048 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 15 01:59:39.377048 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 15 01:59:39.377048 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 15 01:59:39.377048 ignition[1129]: INFO : 
files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 15 01:59:39.377048 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 15 01:59:39.380989 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 15 01:59:39.380989 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 15 01:59:39.380989 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 15 01:59:39.382833 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 15 01:59:39.382833 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 15 01:59:39.382833 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Jan 15 01:59:39.748770 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 15 01:59:40.919076 ignition[1129]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jan 15 01:59:40.919076 ignition[1129]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 15 01:59:40.923041 ignition[1129]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 15 01:59:40.925629 ignition[1129]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 15 01:59:40.925629 ignition[1129]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 15 01:59:40.928359 ignition[1129]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 15 01:59:40.928359 ignition[1129]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 15 01:59:40.928359 ignition[1129]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 15 01:59:40.928359 ignition[1129]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 15 01:59:40.928359 ignition[1129]: INFO : files: files passed Jan 15 01:59:40.928359 ignition[1129]: INFO : Ignition finished successfully Jan 15 01:59:40.936672 kernel: audit: type=1130 audit(1768442380.928:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:40.928000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:40.928178 systemd[1]: Finished ignition-files.service - Ignition (files). 
Jan 15 01:59:40.930537 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 15 01:59:40.940189 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 15 01:59:40.944356 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 15 01:59:40.953000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:40.953278 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 15 01:59:40.958494 kernel: audit: type=1130 audit(1768442380.953:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:40.957000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:40.964089 kernel: audit: type=1131 audit(1768442380.957:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:40.970910 initrd-setup-root-after-ignition[1164]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 15 01:59:40.971718 initrd-setup-root-after-ignition[1160]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 15 01:59:40.971718 initrd-setup-root-after-ignition[1160]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 15 01:59:40.974015 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 15 01:59:40.975702 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 15 01:59:40.981409 kernel: audit: type=1130 audit(1768442380.975:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:40.975000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:40.981713 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 15 01:59:41.039859 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 15 01:59:41.040046 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 15 01:59:41.058553 kernel: audit: type=1130 audit(1768442381.041:49): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.058597 kernel: audit: type=1131 audit(1768442381.041:50): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.041000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:59:41.041000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.042244 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 15 01:59:41.059477 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 15 01:59:41.061641 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 15 01:59:41.062980 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 15 01:59:41.103000 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 15 01:59:41.113156 kernel: audit: type=1130 audit(1768442381.103:51): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.103000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.107237 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 15 01:59:41.151805 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 15 01:59:41.152304 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 15 01:59:41.153673 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 15 01:59:41.156289 systemd[1]: Stopped target timers.target - Timer Units. Jan 15 01:59:41.172373 kernel: audit: type=1131 audit(1768442381.161:52): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.161000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.158785 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 15 01:59:41.159011 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 15 01:59:41.172532 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 15 01:59:41.174265 systemd[1]: Stopped target basic.target - Basic System. Jan 15 01:59:41.177008 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 15 01:59:41.179729 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 15 01:59:41.182474 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 15 01:59:41.185192 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 15 01:59:41.187873 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 15 01:59:41.190551 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 15 01:59:41.193281 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 15 01:59:41.195947 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 15 01:59:41.203000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:59:41.198632 systemd[1]: Stopped target swap.target - Swaps. Jan 15 01:59:41.201265 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 15 01:59:41.201534 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 15 01:59:41.205256 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 15 01:59:41.207997 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 15 01:59:41.215000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.210475 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 15 01:59:41.210663 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 15 01:59:41.219000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.213015 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 15 01:59:41.221000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.213297 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 15 01:59:41.216878 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 15 01:59:41.217206 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 15 01:59:41.219548 systemd[1]: ignition-files.service: Deactivated successfully. Jan 15 01:59:41.219769 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 15 01:59:41.224529 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 15 01:59:41.232471 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 15 01:59:41.234354 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 15 01:59:41.237000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.234589 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 15 01:59:41.237902 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 15 01:59:41.239238 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 15 01:59:41.242000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.242382 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 15 01:59:41.243484 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 15 01:59:41.245000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.257853 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Jan 15 01:59:41.258000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.258000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.258036 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 15 01:59:41.280608 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 15 01:59:41.287517 ignition[1184]: INFO : Ignition 2.22.0 Jan 15 01:59:41.287517 ignition[1184]: INFO : Stage: umount Jan 15 01:59:41.288993 ignition[1184]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 15 01:59:41.288993 ignition[1184]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 15 01:59:41.290256 ignition[1184]: INFO : umount: umount passed Jan 15 01:59:41.291792 ignition[1184]: INFO : Ignition finished successfully Jan 15 01:59:41.293830 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 15 01:59:41.294023 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 15 01:59:41.295000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.295690 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 15 01:59:41.296000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.295758 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 15 01:59:41.298000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.296940 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 15 01:59:41.299000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.297010 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 15 01:59:41.298201 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 15 01:59:41.302000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.298270 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 15 01:59:41.299554 systemd[1]: Stopped target network.target - Network. Jan 15 01:59:41.300812 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 15 01:59:41.300897 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 15 01:59:41.302225 systemd[1]: Stopped target paths.target - Path Units. Jan 15 01:59:41.303549 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 15 01:59:41.307129 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 15 01:59:41.311551 systemd[1]: Stopped target slices.target - Slice Units. 
Jan 15 01:59:41.312818 systemd[1]: Stopped target sockets.target - Socket Units. Jan 15 01:59:41.313841 systemd[1]: iscsid.socket: Deactivated successfully. Jan 15 01:59:41.318000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.313877 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 15 01:59:41.321000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.314747 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 15 01:59:41.323000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.314774 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 15 01:59:41.315652 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 15 01:59:41.315672 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 15 01:59:41.316634 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 15 01:59:41.316682 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 15 01:59:41.318288 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 15 01:59:41.328000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.318323 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 15 01:59:41.321202 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 15 01:59:41.329000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.322219 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 15 01:59:41.323395 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 15 01:59:41.323486 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 15 01:59:41.324792 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 15 01:59:41.324835 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 15 01:59:41.328726 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 15 01:59:41.328859 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 15 01:59:41.332000 audit: BPF prog-id=9 op=UNLOAD Jan 15 01:59:41.332693 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 15 01:59:41.333659 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 15 01:59:41.333725 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 15 01:59:41.337000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.336324 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 15 01:59:41.336895 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. 
Jan 15 01:59:41.339000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.336958 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 15 01:59:41.337650 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 15 01:59:41.338495 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 15 01:59:41.338600 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 15 01:59:41.344000 audit: BPF prog-id=6 op=UNLOAD Jan 15 01:59:41.344312 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 15 01:59:41.344385 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 15 01:59:41.345000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.345353 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 15 01:59:41.346000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.345405 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 15 01:59:41.348421 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 15 01:59:41.348594 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 15 01:59:41.352000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.354000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.354000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.353131 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 15 01:59:41.353195 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 15 01:59:41.353612 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 15 01:59:41.353638 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 15 01:59:41.353999 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 15 01:59:41.354041 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 15 01:59:41.357000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.354501 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 15 01:59:41.354537 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 15 01:59:41.354944 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 15 01:59:41.359000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 15 01:59:41.354980 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 15 01:59:41.358574 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 15 01:59:41.359290 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 15 01:59:41.359336 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 15 01:59:41.360093 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 15 01:59:41.361000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.362000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.360130 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 15 01:59:41.361288 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jan 15 01:59:41.361323 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 15 01:59:41.362810 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 15 01:59:41.362847 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 15 01:59:41.365132 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 01:59:41.365168 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:59:41.365000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.365000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.378441 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 15 01:59:41.379000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.378954 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 15 01:59:41.381677 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 15 01:59:41.382185 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 15 01:59:41.382000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.382000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:41.383467 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 15 01:59:41.384950 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 15 01:59:41.397254 systemd[1]: Switching root. 
Jan 15 01:59:41.432090 systemd-journald[342]: Received SIGTERM from PID 1 (systemd). Jan 15 01:59:41.432150 systemd-journald[342]: Journal stopped Jan 15 01:59:42.994052 kernel: SELinux: policy capability network_peer_controls=1 Jan 15 01:59:42.997715 kernel: SELinux: policy capability open_perms=1 Jan 15 01:59:42.997735 kernel: SELinux: policy capability extended_socket_class=1 Jan 15 01:59:42.997754 kernel: SELinux: policy capability always_check_network=0 Jan 15 01:59:42.997766 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 15 01:59:42.997780 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 15 01:59:42.997792 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 15 01:59:42.997802 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 15 01:59:42.997813 kernel: SELinux: policy capability userspace_initial_context=0 Jan 15 01:59:42.997831 systemd[1]: Successfully loaded SELinux policy in 62.959ms. Jan 15 01:59:42.997849 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 13.200ms. Jan 15 01:59:42.997865 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 15 01:59:42.997877 systemd[1]: Detected virtualization kvm. Jan 15 01:59:42.997889 systemd[1]: Detected architecture x86-64. Jan 15 01:59:42.997900 systemd[1]: Detected first boot. Jan 15 01:59:42.997916 systemd[1]: Hostname set to <ci-4515-1-0-n-e5e35ee394>. Jan 15 01:59:42.997929 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 15 01:59:42.997941 zram_generator::config[1227]: No configuration found. Jan 15 01:59:42.997961 kernel: Guest personality initialized and is inactive Jan 15 01:59:42.997973 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Jan 15 01:59:42.997985 kernel: Initialized host personality Jan 15 01:59:42.997995 kernel: NET: Registered PF_VSOCK protocol family Jan 15 01:59:42.998006 systemd[1]: Populated /etc with preset unit settings. Jan 15 01:59:42.998020 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 15 01:59:42.998031 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 15 01:59:42.998043 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 15 01:59:42.998072 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 15 01:59:42.998088 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 15 01:59:42.998100 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 15 01:59:42.998112 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 15 01:59:42.998129 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 15 01:59:42.998143 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 15 01:59:42.998156 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 15 01:59:42.998168 systemd[1]: Created slice user.slice - User and Session Slice. Jan 15 01:59:42.998179 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Jan 15 01:59:42.998190 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 15 01:59:42.998201 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 15 01:59:42.998217 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 15 01:59:42.998228 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 15 01:59:42.998239 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 15 01:59:42.998251 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 15 01:59:42.998264 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 15 01:59:42.998276 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 15 01:59:42.998288 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 15 01:59:42.998299 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 15 01:59:42.998311 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 15 01:59:42.998323 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 15 01:59:42.998337 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 15 01:59:42.998348 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 15 01:59:42.998361 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 15 01:59:42.998372 systemd[1]: Reached target slices.target - Slice Units. Jan 15 01:59:42.998383 systemd[1]: Reached target swap.target - Swaps. Jan 15 01:59:42.998395 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 15 01:59:42.998406 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 15 01:59:42.998418 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 15 01:59:42.998430 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 15 01:59:42.998443 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 15 01:59:42.998454 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 15 01:59:42.998466 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 15 01:59:42.998477 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 15 01:59:42.998489 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 15 01:59:42.998500 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 15 01:59:42.998512 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 15 01:59:42.998524 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 15 01:59:42.998536 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 15 01:59:42.998547 systemd[1]: Mounting media.mount - External Media Directory... Jan 15 01:59:42.998559 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 01:59:42.998571 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 15 01:59:42.998582 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... 
Jan 15 01:59:42.998593 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 15 01:59:42.998607 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 15 01:59:42.998618 systemd[1]: Reached target machines.target - Containers. Jan 15 01:59:42.998629 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 15 01:59:42.998640 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 15 01:59:42.998651 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 15 01:59:42.998663 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 15 01:59:42.998675 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 15 01:59:42.998686 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 15 01:59:42.998697 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 15 01:59:42.998709 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 15 01:59:42.998720 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 15 01:59:42.998731 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 15 01:59:42.998742 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 15 01:59:42.998755 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 15 01:59:42.998766 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 15 01:59:42.998780 systemd[1]: Stopped systemd-fsck-usr.service. Jan 15 01:59:42.998792 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 15 01:59:42.998805 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 15 01:59:42.998816 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 15 01:59:42.998827 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 15 01:59:42.998838 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 15 01:59:42.998850 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 15 01:59:42.998861 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 15 01:59:42.998875 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 01:59:42.998889 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 15 01:59:42.998899 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 15 01:59:42.998911 systemd[1]: Mounted media.mount - External Media Directory. Jan 15 01:59:42.998921 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 15 01:59:42.998935 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 15 01:59:42.998945 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. 
Jan 15 01:59:42.998956 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 15 01:59:42.998968 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 15 01:59:42.998979 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 15 01:59:42.998990 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 15 01:59:42.999000 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 15 01:59:42.999013 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 15 01:59:42.999024 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 15 01:59:42.999036 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 15 01:59:42.999046 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 15 01:59:43.000203 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 15 01:59:43.000223 kernel: fuse: init (API version 7.41) Jan 15 01:59:43.000234 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 15 01:59:43.000250 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 15 01:59:43.000261 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 15 01:59:43.000273 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 15 01:59:43.000285 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 15 01:59:43.000296 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 15 01:59:43.000307 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 15 01:59:43.000320 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 15 01:59:43.000332 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 15 01:59:43.000345 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 15 01:59:43.000357 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 15 01:59:43.000369 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 15 01:59:43.000401 systemd-journald[1295]: Collecting audit messages is enabled. Jan 15 01:59:43.000428 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 15 01:59:43.000439 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 15 01:59:43.000453 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 15 01:59:43.000466 systemd-journald[1295]: Journal started Jan 15 01:59:43.000491 systemd-journald[1295]: Runtime Journal (/run/log/journal/daf2ebc79e064bccabee4c74708a9509) is 8M, max 77.9M, 69.9M free. 
Jan 15 01:59:42.746000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 15 01:59:42.865000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:42.867000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:42.870000 audit: BPF prog-id=14 op=UNLOAD Jan 15 01:59:42.870000 audit: BPF prog-id=13 op=UNLOAD Jan 15 01:59:42.871000 audit: BPF prog-id=15 op=LOAD Jan 15 01:59:42.871000 audit: BPF prog-id=16 op=LOAD Jan 15 01:59:42.871000 audit: BPF prog-id=17 op=LOAD Jan 15 01:59:42.932000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:42.935000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:42.935000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:42.939000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:42.939000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:42.942000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:42.942000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:42.946000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:42.946000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:42.949000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:59:42.952000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:42.955000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:42.990000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 15 01:59:42.990000 audit[1295]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=5 a1=7ffe53830480 a2=4000 a3=0 items=0 ppid=1 pid=1295 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:59:42.990000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 15 01:59:42.650874 systemd[1]: Queued start job for default target multi-user.target. Jan 15 01:59:42.675923 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 15 01:59:42.676327 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 15 01:59:43.011076 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 15 01:59:43.024219 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 15 01:59:43.024254 systemd[1]: Started systemd-journald.service - Journal Service. Jan 15 01:59:43.025000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.028000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.028000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.029000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.027814 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 15 01:59:43.027977 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 15 01:59:43.028768 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 15 01:59:43.029877 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 15 01:59:43.031000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.031374 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 15 01:59:43.043317 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. 
Jan 15 01:59:43.048095 kernel: ACPI: bus type drm_connector registered Jan 15 01:59:43.050305 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 15 01:59:43.054336 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 15 01:59:43.056636 kernel: loop1: detected capacity change from 0 to 119256 Jan 15 01:59:43.056494 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 15 01:59:43.058000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.058000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.058104 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 15 01:59:43.073437 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 15 01:59:43.074000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.087827 systemd-journald[1295]: Time spent on flushing to /var/log/journal/daf2ebc79e064bccabee4c74708a9509 is 59.593ms for 1856 entries. Jan 15 01:59:43.087827 systemd-journald[1295]: System Journal (/var/log/journal/daf2ebc79e064bccabee4c74708a9509) is 8M, max 588.1M, 580.1M free. Jan 15 01:59:43.161305 systemd-journald[1295]: Received client request to flush runtime journal. Jan 15 01:59:43.161342 kernel: loop2: detected capacity change from 0 to 224512 Jan 15 01:59:43.096000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.105000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.115000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.163000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.094545 systemd-tmpfiles[1324]: ACLs are not supported, ignoring. Jan 15 01:59:43.094557 systemd-tmpfiles[1324]: ACLs are not supported, ignoring. Jan 15 01:59:43.095017 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 15 01:59:43.104441 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 15 01:59:43.109329 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 15 01:59:43.115387 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 15 01:59:43.162852 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. 
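[Editor's note] As a quick sanity check on the flush statistics journald prints above (59.593 ms for 1856 entries), the average per-entry cost follows directly from the reported figures; a minimal Python sketch with the numbers copied from the log:

    # Figures copied from the systemd-journald flush report above.
    total_ms = 59.593
    entries = 1856

    print(f"{total_ms / entries * 1000:.1f} us per entry")  # ~32.1 us on average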
Jan 15 01:59:43.178333 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 15 01:59:43.178000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.191086 kernel: loop3: detected capacity change from 0 to 1656 Jan 15 01:59:43.200001 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 15 01:59:43.200000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.201000 audit: BPF prog-id=18 op=LOAD Jan 15 01:59:43.202000 audit: BPF prog-id=19 op=LOAD Jan 15 01:59:43.202000 audit: BPF prog-id=20 op=LOAD Jan 15 01:59:43.203193 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 15 01:59:43.204000 audit: BPF prog-id=21 op=LOAD Jan 15 01:59:43.207195 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 15 01:59:43.210236 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 15 01:59:43.222130 kernel: loop4: detected capacity change from 0 to 111544 Jan 15 01:59:43.226000 audit: BPF prog-id=22 op=LOAD Jan 15 01:59:43.226000 audit: BPF prog-id=23 op=LOAD Jan 15 01:59:43.227000 audit: BPF prog-id=24 op=LOAD Jan 15 01:59:43.228243 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 15 01:59:43.231000 audit: BPF prog-id=25 op=LOAD Jan 15 01:59:43.232000 audit: BPF prog-id=26 op=LOAD Jan 15 01:59:43.232000 audit: BPF prog-id=27 op=LOAD Jan 15 01:59:43.232835 systemd-tmpfiles[1378]: ACLs are not supported, ignoring. Jan 15 01:59:43.232860 systemd-tmpfiles[1378]: ACLs are not supported, ignoring. Jan 15 01:59:43.233227 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 15 01:59:43.238650 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 15 01:59:43.240000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.292290 systemd-nsresourced[1381]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 15 01:59:43.295070 kernel: loop5: detected capacity change from 0 to 119256 Jan 15 01:59:43.294000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.293532 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 15 01:59:43.295000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.295457 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. 
Jan 15 01:59:43.323513 kernel: loop6: detected capacity change from 0 to 224512 Jan 15 01:59:43.363079 kernel: loop7: detected capacity change from 0 to 1656 Jan 15 01:59:43.366893 systemd-oomd[1376]: No swap; memory pressure usage will be degraded Jan 15 01:59:43.367546 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 15 01:59:43.368000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.374881 kernel: loop1: detected capacity change from 0 to 111544 Jan 15 01:59:43.394390 (sd-merge)[1384]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. Jan 15 01:59:43.397205 (sd-merge)[1384]: Merged extensions into '/usr'. Jan 15 01:59:43.401786 systemd[1]: Reload requested from client PID 1323 ('systemd-sysext') (unit systemd-sysext.service)... Jan 15 01:59:43.401802 systemd[1]: Reloading... Jan 15 01:59:43.402146 systemd-resolved[1377]: Positive Trust Anchors: Jan 15 01:59:43.403084 systemd-resolved[1377]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 15 01:59:43.403091 systemd-resolved[1377]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 15 01:59:43.403123 systemd-resolved[1377]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 15 01:59:43.431579 systemd-resolved[1377]: Using system hostname 'ci-4515-1-0-n-e5e35ee394'. Jan 15 01:59:43.471082 zram_generator::config[1428]: No configuration found. Jan 15 01:59:43.657293 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 15 01:59:43.657462 systemd[1]: Reloading finished in 255 ms. Jan 15 01:59:43.689000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.688798 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 15 01:59:43.689793 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 15 01:59:43.690000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.693591 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 15 01:59:43.694000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:43.695260 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 15 01:59:43.696884 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... 
Jan 15 01:59:43.704179 systemd[1]: Starting ensure-sysext.service... Jan 15 01:59:43.705000 audit: BPF prog-id=8 op=UNLOAD Jan 15 01:59:43.705000 audit: BPF prog-id=7 op=UNLOAD Jan 15 01:59:43.705600 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 15 01:59:43.706000 audit: BPF prog-id=28 op=LOAD Jan 15 01:59:43.706000 audit: BPF prog-id=29 op=LOAD Jan 15 01:59:43.708722 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 15 01:59:43.717000 audit: BPF prog-id=30 op=LOAD Jan 15 01:59:43.717000 audit: BPF prog-id=22 op=UNLOAD Jan 15 01:59:43.717000 audit: BPF prog-id=31 op=LOAD Jan 15 01:59:43.717000 audit: BPF prog-id=32 op=LOAD Jan 15 01:59:43.717000 audit: BPF prog-id=23 op=UNLOAD Jan 15 01:59:43.717000 audit: BPF prog-id=24 op=UNLOAD Jan 15 01:59:43.718000 audit: BPF prog-id=33 op=LOAD Jan 15 01:59:43.718000 audit: BPF prog-id=15 op=UNLOAD Jan 15 01:59:43.718000 audit: BPF prog-id=34 op=LOAD Jan 15 01:59:43.718000 audit: BPF prog-id=35 op=LOAD Jan 15 01:59:43.718000 audit: BPF prog-id=16 op=UNLOAD Jan 15 01:59:43.718000 audit: BPF prog-id=17 op=UNLOAD Jan 15 01:59:43.718000 audit: BPF prog-id=36 op=LOAD Jan 15 01:59:43.718000 audit: BPF prog-id=25 op=UNLOAD Jan 15 01:59:43.719000 audit: BPF prog-id=37 op=LOAD Jan 15 01:59:43.719000 audit: BPF prog-id=38 op=LOAD Jan 15 01:59:43.719000 audit: BPF prog-id=26 op=UNLOAD Jan 15 01:59:43.719000 audit: BPF prog-id=27 op=UNLOAD Jan 15 01:59:43.719000 audit: BPF prog-id=39 op=LOAD Jan 15 01:59:43.719000 audit: BPF prog-id=21 op=UNLOAD Jan 15 01:59:43.720000 audit: BPF prog-id=40 op=LOAD Jan 15 01:59:43.720000 audit: BPF prog-id=18 op=UNLOAD Jan 15 01:59:43.720000 audit: BPF prog-id=41 op=LOAD Jan 15 01:59:43.720000 audit: BPF prog-id=42 op=LOAD Jan 15 01:59:43.720000 audit: BPF prog-id=19 op=UNLOAD Jan 15 01:59:43.720000 audit: BPF prog-id=20 op=UNLOAD Jan 15 01:59:43.723528 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 15 01:59:43.732204 systemd[1]: Reload requested from client PID 1472 ('systemctl') (unit ensure-sysext.service)... Jan 15 01:59:43.732219 systemd[1]: Reloading... Jan 15 01:59:43.764406 systemd-tmpfiles[1473]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 15 01:59:43.765130 systemd-tmpfiles[1473]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 15 01:59:43.765490 systemd-tmpfiles[1473]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 15 01:59:43.768615 systemd-tmpfiles[1473]: ACLs are not supported, ignoring. Jan 15 01:59:43.768821 systemd-tmpfiles[1473]: ACLs are not supported, ignoring. Jan 15 01:59:43.774432 systemd-udevd[1474]: Using default interface naming scheme 'v257'. Jan 15 01:59:43.782163 systemd-tmpfiles[1473]: Detected autofs mount point /boot during canonicalization of boot. Jan 15 01:59:43.782172 systemd-tmpfiles[1473]: Skipping /boot Jan 15 01:59:43.795114 systemd-tmpfiles[1473]: Detected autofs mount point /boot during canonicalization of boot. Jan 15 01:59:43.795176 systemd-tmpfiles[1473]: Skipping /boot Jan 15 01:59:43.811094 zram_generator::config[1513]: No configuration found. 
Jan 15 01:59:43.965104 kernel: mousedev: PS/2 mouse device common for all mice Jan 15 01:59:43.981075 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Jan 15 01:59:43.987125 kernel: ACPI: button: Power Button [PWRF] Jan 15 01:59:44.036071 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jan 15 01:59:44.036103 kernel: Console: switching to colour dummy device 80x25 Jan 15 01:59:44.036116 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jan 15 01:59:44.036310 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 15 01:59:44.036324 kernel: [drm] features: -context_init Jan 15 01:59:44.037198 kernel: [drm] number of scanouts: 1 Jan 15 01:59:44.037216 kernel: [drm] number of cap sets: 0 Jan 15 01:59:44.038073 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jan 15 01:59:44.047168 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 15 01:59:44.047448 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 15 01:59:44.047544 systemd[1]: Reloading finished in 315 ms. Jan 15 01:59:44.057901 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Jan 15 01:59:44.057936 kernel: Console: switching to colour frame buffer device 160x50 Jan 15 01:59:44.059957 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 15 01:59:44.065185 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 15 01:59:44.069000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.072071 kernel: kauditd_printk_skb: 137 callbacks suppressed Jan 15 01:59:44.072100 kernel: audit: type=1130 audit(1768442384.069:188): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.070000 audit: BPF prog-id=43 op=LOAD Jan 15 01:59:44.070000 audit: BPF prog-id=33 op=UNLOAD Jan 15 01:59:44.075120 kernel: audit: type=1334 audit(1768442384.070:189): prog-id=43 op=LOAD Jan 15 01:59:44.075149 kernel: audit: type=1334 audit(1768442384.070:190): prog-id=33 op=UNLOAD Jan 15 01:59:44.075162 kernel: audit: type=1334 audit(1768442384.070:191): prog-id=44 op=LOAD Jan 15 01:59:44.070000 audit: BPF prog-id=44 op=LOAD Jan 15 01:59:44.076589 kernel: audit: type=1334 audit(1768442384.070:192): prog-id=45 op=LOAD Jan 15 01:59:44.070000 audit: BPF prog-id=45 op=LOAD Jan 15 01:59:44.070000 audit: BPF prog-id=34 op=UNLOAD Jan 15 01:59:44.079250 kernel: audit: type=1334 audit(1768442384.070:193): prog-id=34 op=UNLOAD Jan 15 01:59:44.079574 kernel: audit: type=1334 audit(1768442384.070:194): prog-id=35 op=UNLOAD Jan 15 01:59:44.070000 audit: BPF prog-id=35 op=UNLOAD Jan 15 01:59:44.079962 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Jan 15 01:59:44.083301 kernel: audit: type=1334 audit(1768442384.071:195): prog-id=46 op=LOAD Jan 15 01:59:44.083330 kernel: audit: type=1334 audit(1768442384.071:196): prog-id=40 op=UNLOAD Jan 15 01:59:44.083343 kernel: audit: type=1334 audit(1768442384.071:197): prog-id=47 op=LOAD Jan 15 01:59:44.071000 audit: BPF prog-id=46 op=LOAD Jan 15 01:59:44.071000 audit: BPF prog-id=40 op=UNLOAD Jan 15 01:59:44.071000 audit: BPF prog-id=47 op=LOAD Jan 15 01:59:44.071000 audit: BPF prog-id=48 op=LOAD Jan 15 01:59:44.071000 audit: BPF prog-id=41 op=UNLOAD Jan 15 01:59:44.071000 audit: BPF prog-id=42 op=UNLOAD Jan 15 01:59:44.071000 audit: BPF prog-id=49 op=LOAD Jan 15 01:59:44.071000 audit: BPF prog-id=39 op=UNLOAD Jan 15 01:59:44.073000 audit: BPF prog-id=50 op=LOAD Jan 15 01:59:44.073000 audit: BPF prog-id=51 op=LOAD Jan 15 01:59:44.073000 audit: BPF prog-id=28 op=UNLOAD Jan 15 01:59:44.073000 audit: BPF prog-id=29 op=UNLOAD Jan 15 01:59:44.073000 audit: BPF prog-id=52 op=LOAD Jan 15 01:59:44.073000 audit: BPF prog-id=36 op=UNLOAD Jan 15 01:59:44.073000 audit: BPF prog-id=53 op=LOAD Jan 15 01:59:44.073000 audit: BPF prog-id=54 op=LOAD Jan 15 01:59:44.073000 audit: BPF prog-id=37 op=UNLOAD Jan 15 01:59:44.073000 audit: BPF prog-id=38 op=UNLOAD Jan 15 01:59:44.076000 audit: BPF prog-id=55 op=LOAD Jan 15 01:59:44.076000 audit: BPF prog-id=30 op=UNLOAD Jan 15 01:59:44.076000 audit: BPF prog-id=56 op=LOAD Jan 15 01:59:44.076000 audit: BPF prog-id=57 op=LOAD Jan 15 01:59:44.076000 audit: BPF prog-id=31 op=UNLOAD Jan 15 01:59:44.076000 audit: BPF prog-id=32 op=UNLOAD Jan 15 01:59:44.089000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.123405 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Jan 15 01:59:44.123597 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 15 01:59:44.123732 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 15 01:59:44.129739 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 01:59:44.132032 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 15 01:59:44.134733 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 15 01:59:44.134956 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 15 01:59:44.140238 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 15 01:59:44.141488 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 15 01:59:44.148422 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 15 01:59:44.150217 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 15 01:59:44.150365 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 15 01:59:44.152254 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 15 01:59:44.160253 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
Jan 15 01:59:44.165838 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 15 01:59:44.171000 audit: BPF prog-id=58 op=LOAD Jan 15 01:59:44.167171 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 15 01:59:44.174149 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 15 01:59:44.177270 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 15 01:59:44.177713 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 01:59:44.183000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.183000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.181228 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 15 01:59:44.181417 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 15 01:59:44.186725 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 01:59:44.186944 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 15 01:59:44.196305 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 15 01:59:44.198378 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 15 01:59:44.198526 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 15 01:59:44.198607 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 15 01:59:44.198686 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 01:59:44.204037 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 01:59:44.205487 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 15 01:59:44.209115 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 15 01:59:44.213014 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 15 01:59:44.213996 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 15 01:59:44.215281 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. 
Jan 15 01:59:44.215364 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 15 01:59:44.215499 systemd[1]: Reached target time-set.target - System Time Set. Jan 15 01:59:44.217051 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 15 01:59:44.222284 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 15 01:59:44.237000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.237000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.239000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.239000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.235418 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 15 01:59:44.237874 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 15 01:59:44.238027 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 15 01:59:44.252000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.249767 systemd[1]: Finished ensure-sysext.service. Jan 15 01:59:44.262491 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 15 01:59:44.265000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.270515 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 15 01:59:44.275558 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 01:59:44.279562 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 15 01:59:44.279803 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 15 01:59:44.285000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.285000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 01:59:44.286000 audit[1601]: SYSTEM_BOOT pid=1601 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.286890 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 15 01:59:44.301000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.299340 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 15 01:59:44.320000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.320000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 01:59:44.318976 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 15 01:59:44.319176 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 15 01:59:44.326819 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 15 01:59:44.326857 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 15 01:59:44.342253 kernel: PTP clock support registered Jan 15 01:59:44.348883 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 15 01:59:44.351000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 15 01:59:44.351000 audit[1640]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fff1160af20 a2=420 a3=0 items=0 ppid=1588 pid=1640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 01:59:44.351000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 15 01:59:44.352327 augenrules[1640]: No rules Jan 15 01:59:44.349162 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 15 01:59:44.354225 systemd[1]: audit-rules.service: Deactivated successfully. Jan 15 01:59:44.355112 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 15 01:59:44.375430 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 01:59:44.375614 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:59:44.382299 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 15 01:59:44.389294 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 01:59:44.421348 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 15 01:59:44.422107 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:59:44.436175 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 15 01:59:44.447430 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. 
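[Editor's note] The audit PROCTITLE record above stores the triggering command line as hex-encoded argv joined by NUL bytes. A minimal Python sketch to decode it; the hex string is copied verbatim from the log, nothing else is assumed:

    # Decode an audit PROCTITLE value: hex-encoded argv separated by NUL bytes.
    proctitle = "2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573"
    argv = bytes.fromhex(proctitle).split(b"\x00")
    print([a.decode() for a in argv])
    # ['/sbin/auditctl', '-R', '/etc/audit/audit.rules']

This is consistent with the surrounding entries: augenrules reports "No rules" and then audit-rules.service finishes after auditctl loads /etc/audit/audit.rules.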
Jan 15 01:59:44.451831 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 15 01:59:44.486470 systemd-networkd[1600]: lo: Link UP Jan 15 01:59:44.486477 systemd-networkd[1600]: lo: Gained carrier Jan 15 01:59:44.490489 systemd-networkd[1600]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 15 01:59:44.490497 systemd-networkd[1600]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 15 01:59:44.490531 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 15 01:59:44.491092 systemd[1]: Reached target network.target - Network. Jan 15 01:59:44.491313 systemd-networkd[1600]: eth0: Link UP Jan 15 01:59:44.491444 systemd-networkd[1600]: eth0: Gained carrier Jan 15 01:59:44.491455 systemd-networkd[1600]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 15 01:59:44.493254 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 15 01:59:44.499237 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 15 01:59:44.508115 systemd-networkd[1600]: eth0: DHCPv4 address 10.0.1.164/25, gateway 10.0.1.129 acquired from 10.0.1.129 Jan 15 01:59:44.526750 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 15 01:59:44.568224 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 15 01:59:45.545119 ldconfig[1596]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 15 01:59:45.553851 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 15 01:59:45.561487 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 15 01:59:45.613894 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 15 01:59:45.615776 systemd[1]: Reached target sysinit.target - System Initialization. Jan 15 01:59:45.619282 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 15 01:59:45.621301 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 15 01:59:45.625050 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jan 15 01:59:45.628300 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 15 01:59:45.629786 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 15 01:59:45.631186 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 15 01:59:45.632868 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 15 01:59:45.634239 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 15 01:59:45.635566 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 15 01:59:45.635772 systemd[1]: Reached target paths.target - Path Units. Jan 15 01:59:45.637086 systemd[1]: Reached target timers.target - Timer Units. 
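[Editor's note] The DHCPv4 lease above (10.0.1.164/25 with gateway 10.0.1.129) can be expanded with Python's ipaddress module; a minimal sketch using the address and gateway exactly as reported by systemd-networkd:

    import ipaddress

    # Address and gateway as reported in the DHCPv4 log entry above.
    iface = ipaddress.ip_interface("10.0.1.164/25")
    gateway = ipaddress.ip_address("10.0.1.129")

    print(iface.network)                # 10.0.1.128/25
    print(iface.network.num_addresses)  # 128 addresses in the /25
    print(gateway in iface.network)     # True: the gateway is on-link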
Jan 15 01:59:45.641719 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 15 01:59:45.646429 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 15 01:59:45.654054 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 15 01:59:45.658441 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 15 01:59:45.659960 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 15 01:59:45.671594 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 15 01:59:45.675961 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 15 01:59:45.680662 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 15 01:59:45.685616 systemd[1]: Reached target sockets.target - Socket Units. Jan 15 01:59:45.688886 systemd[1]: Reached target basic.target - Basic System. Jan 15 01:59:45.690376 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 15 01:59:45.690447 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 15 01:59:45.696628 systemd[1]: Starting chronyd.service - NTP client/server... Jan 15 01:59:45.703284 systemd[1]: Starting containerd.service - containerd container runtime... Jan 15 01:59:45.713713 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 15 01:59:45.724345 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 15 01:59:45.732393 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 15 01:59:45.743380 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 15 01:59:45.752972 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 15 01:59:45.755663 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 15 01:59:45.760127 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:59:45.768363 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jan 15 01:59:45.775118 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 15 01:59:45.782245 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 15 01:59:45.790937 jq[1675]: false Jan 15 01:59:45.791368 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 15 01:59:45.804345 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 15 01:59:45.811966 google_oslogin_nss_cache[1678]: oslogin_cache_refresh[1678]: Refreshing passwd entry cache Jan 15 01:59:45.813113 oslogin_cache_refresh[1678]: Refreshing passwd entry cache Jan 15 01:59:45.813466 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 15 01:59:45.818248 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 15 01:59:45.818886 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 15 01:59:45.820303 systemd[1]: Starting update-engine.service - Update Engine... 
Jan 15 01:59:45.824290 google_oslogin_nss_cache[1678]: oslogin_cache_refresh[1678]: Failure getting users, quitting Jan 15 01:59:45.824290 google_oslogin_nss_cache[1678]: oslogin_cache_refresh[1678]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 15 01:59:45.824290 google_oslogin_nss_cache[1678]: oslogin_cache_refresh[1678]: Refreshing group entry cache Jan 15 01:59:45.823947 oslogin_cache_refresh[1678]: Failure getting users, quitting Jan 15 01:59:45.823960 oslogin_cache_refresh[1678]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jan 15 01:59:45.823993 oslogin_cache_refresh[1678]: Refreshing group entry cache Jan 15 01:59:45.825032 extend-filesystems[1676]: Found /dev/vda6 Jan 15 01:59:45.828734 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 15 01:59:45.834098 google_oslogin_nss_cache[1678]: oslogin_cache_refresh[1678]: Failure getting groups, quitting Jan 15 01:59:45.834098 google_oslogin_nss_cache[1678]: oslogin_cache_refresh[1678]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 15 01:59:45.833089 oslogin_cache_refresh[1678]: Failure getting groups, quitting Jan 15 01:59:45.833098 oslogin_cache_refresh[1678]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jan 15 01:59:45.834654 extend-filesystems[1676]: Found /dev/vda9 Jan 15 01:59:45.836987 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 15 01:59:45.838789 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 15 01:59:45.839471 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 15 01:59:45.839818 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jan 15 01:59:45.840873 extend-filesystems[1676]: Checking size of /dev/vda9 Jan 15 01:59:45.840940 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jan 15 01:59:45.861880 update_engine[1688]: I20260115 01:59:45.861824 1688 main.cc:92] Flatcar Update Engine starting Jan 15 01:59:45.869572 chronyd[1670]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 15 01:59:45.871418 jq[1689]: true Jan 15 01:59:45.874095 extend-filesystems[1676]: Resized partition /dev/vda9 Jan 15 01:59:45.881488 chronyd[1670]: Loaded seccomp filter (level 2) Jan 15 01:59:45.881606 systemd[1]: Started chronyd.service - NTP client/server. Jan 15 01:59:45.894100 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 15 01:59:45.894308 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 15 01:59:45.900393 extend-filesystems[1707]: resize2fs 1.47.3 (8-Jul-2025) Jan 15 01:59:45.910956 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 15 01:59:45.910512 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 15 01:59:45.908015 dbus-daemon[1673]: [system] SELinux support is enabled Jan 15 01:59:45.914307 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 15 01:59:45.914329 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
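[Editor's note] The EXT4-fs resize above is reported in filesystem blocks. Assuming the usual 4 KiB ext4 block size (the block size itself is not printed in this excerpt), the growth works out roughly as in this Python sketch:

    # Block counts copied from the "EXT4-fs (vda9): resizing filesystem" message above.
    BLOCK_SIZE = 4096  # assumed 4 KiB ext4 block size; not stated in the log
    before, after = 1_617_920, 11_516_923

    gib = lambda blocks: blocks * BLOCK_SIZE / 2**30
    print(f"{gib(before):.1f} GiB -> {gib(after):.1f} GiB")  # ~6.2 GiB -> ~43.9 GiB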
Jan 15 01:59:45.917144 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 15 01:59:45.917165 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 15 01:59:45.922608 tar[1701]: linux-amd64/LICENSE Jan 15 01:59:45.922774 tar[1701]: linux-amd64/helm Jan 15 01:59:45.926860 systemd[1]: Started update-engine.service - Update Engine. Jan 15 01:59:45.927876 update_engine[1688]: I20260115 01:59:45.927686 1688 update_check_scheduler.cc:74] Next update check in 10m16s Jan 15 01:59:45.932328 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 15 01:59:45.939902 jq[1706]: true Jan 15 01:59:45.968220 systemd[1]: motdgen.service: Deactivated successfully. Jan 15 01:59:45.969171 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 15 01:59:46.080515 systemd-logind[1687]: New seat seat0. Jan 15 01:59:46.083679 systemd-logind[1687]: Watching system buttons on /dev/input/event3 (Power Button) Jan 15 01:59:46.084832 systemd-logind[1687]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 15 01:59:46.085122 systemd[1]: Started systemd-logind.service - User Login Management. Jan 15 01:59:46.128017 locksmithd[1727]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 15 01:59:46.150004 sshd_keygen[1726]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 15 01:59:46.178175 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 15 01:59:46.187261 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 15 01:59:46.201225 systemd[1]: issuegen.service: Deactivated successfully. Jan 15 01:59:46.201443 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 15 01:59:46.204823 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 15 01:59:46.214749 containerd[1710]: time="2026-01-15T01:59:46Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 15 01:59:46.215228 bash[1749]: Updated "/home/core/.ssh/authorized_keys" Jan 15 01:59:46.219629 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 15 01:59:46.221970 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 15 01:59:46.225140 containerd[1710]: time="2026-01-15T01:59:46.224321924Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 15 01:59:46.228360 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 15 01:59:46.232302 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. 
Jan 15 01:59:46.234652 containerd[1710]: time="2026-01-15T01:59:46.234492159Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.063µs" Jan 15 01:59:46.234652 containerd[1710]: time="2026-01-15T01:59:46.234532827Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 15 01:59:46.234652 containerd[1710]: time="2026-01-15T01:59:46.234603148Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 15 01:59:46.234652 containerd[1710]: time="2026-01-15T01:59:46.234617644Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 15 01:59:46.234736 containerd[1710]: time="2026-01-15T01:59:46.234725605Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 15 01:59:46.234759 containerd[1710]: time="2026-01-15T01:59:46.234743898Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 15 01:59:46.234809 containerd[1710]: time="2026-01-15T01:59:46.234791639Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 15 01:59:46.234809 containerd[1710]: time="2026-01-15T01:59:46.234806990Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 15 01:59:46.236084 containerd[1710]: time="2026-01-15T01:59:46.235115155Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 15 01:59:46.236084 containerd[1710]: time="2026-01-15T01:59:46.235135198Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 15 01:59:46.236084 containerd[1710]: time="2026-01-15T01:59:46.235145845Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 15 01:59:46.236084 containerd[1710]: time="2026-01-15T01:59:46.235152493Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 15 01:59:46.236084 containerd[1710]: time="2026-01-15T01:59:46.235290177Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 15 01:59:46.236084 containerd[1710]: time="2026-01-15T01:59:46.235300146Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 15 01:59:46.236084 containerd[1710]: time="2026-01-15T01:59:46.235356840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 15 01:59:46.236084 containerd[1710]: time="2026-01-15T01:59:46.235496666Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 15 01:59:46.236084 containerd[1710]: time="2026-01-15T01:59:46.235522481Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs 
type=io.containerd.snapshotter.v1 Jan 15 01:59:46.236084 containerd[1710]: time="2026-01-15T01:59:46.235536340Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 15 01:59:46.236084 containerd[1710]: time="2026-01-15T01:59:46.235621030Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 15 01:59:46.235566 systemd[1]: Reached target getty.target - Login Prompts. Jan 15 01:59:46.236326 containerd[1710]: time="2026-01-15T01:59:46.235800911Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 15 01:59:46.236326 containerd[1710]: time="2026-01-15T01:59:46.235846629Z" level=info msg="metadata content store policy set" policy=shared Jan 15 01:59:46.238123 systemd[1]: Starting sshkeys.service... Jan 15 01:59:46.253769 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 15 01:59:46.257217 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 15 01:59:46.275087 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:59:46.295354 containerd[1710]: time="2026-01-15T01:59:46.295325707Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 15 01:59:46.295510 containerd[1710]: time="2026-01-15T01:59:46.295455591Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 15 01:59:46.295711 containerd[1710]: time="2026-01-15T01:59:46.295684242Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 15 01:59:46.295816 containerd[1710]: time="2026-01-15T01:59:46.295784792Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 15 01:59:46.295816 containerd[1710]: time="2026-01-15T01:59:46.295801363Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 15 01:59:46.296027 containerd[1710]: time="2026-01-15T01:59:46.295992012Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 15 01:59:46.296094 containerd[1710]: time="2026-01-15T01:59:46.296036449Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 15 01:59:46.296094 containerd[1710]: time="2026-01-15T01:59:46.296049513Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 15 01:59:46.296094 containerd[1710]: time="2026-01-15T01:59:46.296076294Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 15 01:59:46.296094 containerd[1710]: time="2026-01-15T01:59:46.296088708Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 15 01:59:46.296204 containerd[1710]: time="2026-01-15T01:59:46.296102380Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 15 01:59:46.296204 containerd[1710]: time="2026-01-15T01:59:46.296113305Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 15 01:59:46.296204 containerd[1710]: time="2026-01-15T01:59:46.296122762Z" 
level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 15 01:59:46.296204 containerd[1710]: time="2026-01-15T01:59:46.296135805Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 15 01:59:46.296283 containerd[1710]: time="2026-01-15T01:59:46.296240562Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 15 01:59:46.296283 containerd[1710]: time="2026-01-15T01:59:46.296258060Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 15 01:59:46.296283 containerd[1710]: time="2026-01-15T01:59:46.296271633Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 15 01:59:46.296354 containerd[1710]: time="2026-01-15T01:59:46.296286405Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 15 01:59:46.296354 containerd[1710]: time="2026-01-15T01:59:46.296297255Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 15 01:59:46.296354 containerd[1710]: time="2026-01-15T01:59:46.296305946Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 15 01:59:46.296354 containerd[1710]: time="2026-01-15T01:59:46.296318047Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 15 01:59:46.296354 containerd[1710]: time="2026-01-15T01:59:46.296327580Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 15 01:59:46.296354 containerd[1710]: time="2026-01-15T01:59:46.296337426Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 15 01:59:46.296354 containerd[1710]: time="2026-01-15T01:59:46.296348836Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 15 01:59:46.296647 containerd[1710]: time="2026-01-15T01:59:46.296358635Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 15 01:59:46.296647 containerd[1710]: time="2026-01-15T01:59:46.296379683Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 15 01:59:46.296647 containerd[1710]: time="2026-01-15T01:59:46.296420290Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 15 01:59:46.296647 containerd[1710]: time="2026-01-15T01:59:46.296432873Z" level=info msg="Start snapshots syncer" Jan 15 01:59:46.296647 containerd[1710]: time="2026-01-15T01:59:46.296455918Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 15 01:59:46.296739 containerd[1710]: time="2026-01-15T01:59:46.296685620Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 15 01:59:46.296739 containerd[1710]: time="2026-01-15T01:59:46.296731326Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 15 01:59:46.296862 containerd[1710]: time="2026-01-15T01:59:46.296794285Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 15 01:59:46.296942 containerd[1710]: time="2026-01-15T01:59:46.296876215Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 15 01:59:46.296942 containerd[1710]: time="2026-01-15T01:59:46.296902678Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 15 01:59:46.296942 containerd[1710]: time="2026-01-15T01:59:46.296913406Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 15 01:59:46.296942 containerd[1710]: time="2026-01-15T01:59:46.296923427Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 15 01:59:46.296942 containerd[1710]: time="2026-01-15T01:59:46.296933974Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 15 01:59:46.296942 containerd[1710]: time="2026-01-15T01:59:46.296943162Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 15 01:59:46.297134 containerd[1710]: time="2026-01-15T01:59:46.296954148Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 15 01:59:46.297134 containerd[1710]: time="2026-01-15T01:59:46.296965126Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 15 
01:59:46.297134 containerd[1710]: time="2026-01-15T01:59:46.296974832Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 15 01:59:46.297134 containerd[1710]: time="2026-01-15T01:59:46.297023423Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 15 01:59:46.297134 containerd[1710]: time="2026-01-15T01:59:46.297038159Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 15 01:59:46.297134 containerd[1710]: time="2026-01-15T01:59:46.297046233Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 15 01:59:46.297134 containerd[1710]: time="2026-01-15T01:59:46.297072746Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 15 01:59:46.297134 containerd[1710]: time="2026-01-15T01:59:46.297080801Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 15 01:59:46.297134 containerd[1710]: time="2026-01-15T01:59:46.297089354Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 15 01:59:46.297134 containerd[1710]: time="2026-01-15T01:59:46.297098597Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 15 01:59:46.297134 containerd[1710]: time="2026-01-15T01:59:46.297114395Z" level=info msg="runtime interface created" Jan 15 01:59:46.297134 containerd[1710]: time="2026-01-15T01:59:46.297119937Z" level=info msg="created NRI interface" Jan 15 01:59:46.297134 containerd[1710]: time="2026-01-15T01:59:46.297127133Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 15 01:59:46.297134 containerd[1710]: time="2026-01-15T01:59:46.297136976Z" level=info msg="Connect containerd service" Jan 15 01:59:46.297536 containerd[1710]: time="2026-01-15T01:59:46.297154491Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 15 01:59:46.297711 containerd[1710]: time="2026-01-15T01:59:46.297689785Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 15 01:59:46.413118 containerd[1710]: time="2026-01-15T01:59:46.412762922Z" level=info msg="Start subscribing containerd event" Jan 15 01:59:46.413118 containerd[1710]: time="2026-01-15T01:59:46.412828593Z" level=info msg="Start recovering state" Jan 15 01:59:46.413118 containerd[1710]: time="2026-01-15T01:59:46.412922026Z" level=info msg="Start event monitor" Jan 15 01:59:46.413118 containerd[1710]: time="2026-01-15T01:59:46.412932366Z" level=info msg="Start cni network conf syncer for default" Jan 15 01:59:46.413118 containerd[1710]: time="2026-01-15T01:59:46.412939500Z" level=info msg="Start streaming server" Jan 15 01:59:46.413118 containerd[1710]: time="2026-01-15T01:59:46.412947917Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 15 01:59:46.413118 containerd[1710]: time="2026-01-15T01:59:46.412955110Z" level=info msg="runtime interface starting up..." 
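The CRI plugin's error above ("no network config found in /etc/cni/net.d: cni plugin not initialized") is normal on a freshly provisioned node: nothing has installed a CNI network config yet. A small, hypothetical Python sketch of the same check; the directory comes from the confDir value in the cri config logged above, while the *.conf*/*.json file patterns are an assumption about typical CNI config names, not something this log confirms:

    from pathlib import Path

    # Mirror the containerd warning above: it clears once a CNI plugin
    # (flannel, calico, ...) writes a network config into /etc/cni/net.d.
    conf_dir = Path("/etc/cni/net.d")  # confDir from the cri plugin config in the log
    candidates = []
    if conf_dir.is_dir():
        candidates = sorted(conf_dir.glob("*.conf*")) + sorted(conf_dir.glob("*.json"))
    if not candidates:
        print(f"no network config found in {conf_dir} (matches the containerd warning)")
    else:
        for path in candidates:
            print("found CNI config:", path)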
Jan 15 01:59:46.413118 containerd[1710]: time="2026-01-15T01:59:46.412960492Z" level=info msg="starting plugins..." Jan 15 01:59:46.413118 containerd[1710]: time="2026-01-15T01:59:46.412970933Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 15 01:59:46.413718 containerd[1710]: time="2026-01-15T01:59:46.413645509Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 15 01:59:46.413718 containerd[1710]: time="2026-01-15T01:59:46.413692403Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 15 01:59:46.415235 containerd[1710]: time="2026-01-15T01:59:46.415111782Z" level=info msg="containerd successfully booted in 0.202269s" Jan 15 01:59:46.415329 systemd[1]: Started containerd.service - containerd container runtime. Jan 15 01:59:46.491080 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 15 01:59:46.526166 systemd-networkd[1600]: eth0: Gained IPv6LL Jan 15 01:59:46.528904 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 15 01:59:46.530689 systemd[1]: Reached target network-online.target - Network is Online. Jan 15 01:59:46.534810 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 01:59:46.538210 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 15 01:59:46.558001 extend-filesystems[1707]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 15 01:59:46.558001 extend-filesystems[1707]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 15 01:59:46.558001 extend-filesystems[1707]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Jan 15 01:59:46.562243 extend-filesystems[1676]: Resized filesystem in /dev/vda9 Jan 15 01:59:46.561804 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 15 01:59:46.564231 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 15 01:59:46.595321 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 15 01:59:46.648192 tar[1701]: linux-amd64/README.md Jan 15 01:59:46.664993 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 15 01:59:47.041346 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:59:47.288217 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:59:48.172745 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 01:59:48.186662 (kubelet)[1814]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 01:59:49.069512 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:59:49.312125 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:59:49.513410 kubelet[1814]: E0115 01:59:49.513218 1814 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 01:59:49.520155 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 01:59:49.520531 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 01:59:49.521463 systemd[1]: kubelet.service: Consumed 1.820s CPU time, 266.7M memory peak. Jan 15 01:59:51.246860 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
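The kubelet failure above is a similar first-boot condition: /var/lib/kubelet/config.yaml does not exist yet, so the service exits and systemd keeps retrying until a provisioner (typically kubeadm init or kubeadm join) writes that file. A minimal sketch of the same pre-flight check, using only the path from the error message; whether kubeadm or something else provisions this particular host is an assumption:

    from pathlib import Path

    # Path taken from the kubelet error above; it is created by kubeadm
    # (or another provisioner), not shipped in the Flatcar image itself.
    kubelet_config = Path("/var/lib/kubelet/config.yaml")

    if kubelet_config.exists():
        print("kubelet config present:", kubelet_config)
    else:
        print(f"{kubelet_config} missing; kubelet will keep exiting until it is provisioned")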
Jan 15 01:59:51.255828 systemd[1]: Started sshd@0-10.0.1.164:22-4.153.228.146:45670.service - OpenSSH per-connection server daemon (4.153.228.146:45670). Jan 15 01:59:51.914229 sshd[1824]: Accepted publickey for core from 4.153.228.146 port 45670 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 01:59:51.917754 sshd-session[1824]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 01:59:51.943346 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 15 01:59:51.946679 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 15 01:59:51.963606 systemd-logind[1687]: New session 1 of user core. Jan 15 01:59:51.994128 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 15 01:59:52.000922 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 15 01:59:52.025668 (systemd)[1833]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 15 01:59:52.032450 systemd-logind[1687]: New session c1 of user core. Jan 15 01:59:52.235959 systemd[1833]: Queued start job for default target default.target. Jan 15 01:59:52.243830 systemd[1833]: Created slice app.slice - User Application Slice. Jan 15 01:59:52.244205 systemd[1833]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 15 01:59:52.244272 systemd[1833]: Reached target paths.target - Paths. Jan 15 01:59:52.244361 systemd[1833]: Reached target timers.target - Timers. Jan 15 01:59:52.245550 systemd[1833]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 15 01:59:52.250193 systemd[1833]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 15 01:59:52.264132 systemd[1833]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 15 01:59:52.274425 systemd[1833]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 15 01:59:52.274520 systemd[1833]: Reached target sockets.target - Sockets. Jan 15 01:59:52.274552 systemd[1833]: Reached target basic.target - Basic System. Jan 15 01:59:52.274583 systemd[1833]: Reached target default.target - Main User Target. Jan 15 01:59:52.274608 systemd[1833]: Startup finished in 223ms. Jan 15 01:59:52.275136 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 15 01:59:52.286594 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 15 01:59:52.608685 systemd[1]: Started sshd@1-10.0.1.164:22-4.153.228.146:45684.service - OpenSSH per-connection server daemon (4.153.228.146:45684). Jan 15 01:59:53.100115 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:59:53.121178 coreos-metadata[1672]: Jan 15 01:59:53.120 WARN failed to locate config-drive, using the metadata service API instead Jan 15 01:59:53.168387 coreos-metadata[1672]: Jan 15 01:59:53.168 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 15 01:59:53.182143 sshd[1846]: Accepted publickey for core from 4.153.228.146 port 45684 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 01:59:53.184966 sshd-session[1846]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 01:59:53.196684 systemd-logind[1687]: New session 2 of user core. Jan 15 01:59:53.212024 systemd[1]: Started session-2.scope - Session 2 of User core. 
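The accepted logins above identify the client key by its OpenSSH SHA256 fingerprint, which is the base64-encoded SHA-256 digest of the raw public-key blob with the trailing '=' padding removed. A self-contained sketch of that derivation; the commented example key line is hypothetical, not the key from this host:

    import base64
    import hashlib

    def ssh_sha256_fingerprint(pubkey_line: str) -> str:
        """Return the OpenSSH-style fingerprint (SHA256:...) for a public key
        line of the form 'ssh-ed25519 AAAA... comment'."""
        blob = base64.b64decode(pubkey_line.split()[1])        # raw key blob
        digest = hashlib.sha256(blob).digest()                  # SHA-256 of the blob
        return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

    # Hypothetical usage (not the key from this log):
    # print(ssh_sha256_fingerprint("ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAI... core"))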
Jan 15 01:59:53.351294 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 15 01:59:53.371613 coreos-metadata[1775]: Jan 15 01:59:53.371 WARN failed to locate config-drive, using the metadata service API instead Jan 15 01:59:53.410428 coreos-metadata[1775]: Jan 15 01:59:53.410 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 15 01:59:53.488145 sshd[1851]: Connection closed by 4.153.228.146 port 45684 Jan 15 01:59:53.489292 sshd-session[1846]: pam_unix(sshd:session): session closed for user core Jan 15 01:59:53.500380 systemd[1]: sshd@1-10.0.1.164:22-4.153.228.146:45684.service: Deactivated successfully. Jan 15 01:59:53.505903 systemd[1]: session-2.scope: Deactivated successfully. Jan 15 01:59:53.508286 systemd-logind[1687]: Session 2 logged out. Waiting for processes to exit. Jan 15 01:59:53.511672 systemd-logind[1687]: Removed session 2. Jan 15 01:59:53.607905 systemd[1]: Started sshd@2-10.0.1.164:22-4.153.228.146:34180.service - OpenSSH per-connection server daemon (4.153.228.146:34180). Jan 15 01:59:53.724464 coreos-metadata[1672]: Jan 15 01:59:53.724 INFO Fetch successful Jan 15 01:59:53.724784 coreos-metadata[1672]: Jan 15 01:59:53.724 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 15 01:59:53.863750 coreos-metadata[1672]: Jan 15 01:59:53.863 INFO Fetch successful Jan 15 01:59:53.864006 coreos-metadata[1672]: Jan 15 01:59:53.863 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 15 01:59:53.945502 coreos-metadata[1775]: Jan 15 01:59:53.945 INFO Fetch successful Jan 15 01:59:53.945757 coreos-metadata[1775]: Jan 15 01:59:53.945 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 15 01:59:54.042180 coreos-metadata[1672]: Jan 15 01:59:54.041 INFO Fetch successful Jan 15 01:59:54.042180 coreos-metadata[1672]: Jan 15 01:59:54.041 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 15 01:59:54.188718 sshd[1859]: Accepted publickey for core from 4.153.228.146 port 34180 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 01:59:54.191679 sshd-session[1859]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 01:59:54.203783 systemd-logind[1687]: New session 3 of user core. Jan 15 01:59:54.220667 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 15 01:59:54.493794 sshd[1862]: Connection closed by 4.153.228.146 port 34180 Jan 15 01:59:54.494822 sshd-session[1859]: pam_unix(sshd:session): session closed for user core Jan 15 01:59:54.505481 systemd[1]: sshd@2-10.0.1.164:22-4.153.228.146:34180.service: Deactivated successfully. Jan 15 01:59:54.510281 systemd[1]: session-3.scope: Deactivated successfully. Jan 15 01:59:54.512476 systemd-logind[1687]: Session 3 logged out. Waiting for processes to exit. Jan 15 01:59:54.515810 systemd-logind[1687]: Removed session 3. 
Jan 15 01:59:54.782752 coreos-metadata[1775]: Jan 15 01:59:54.782 INFO Fetch successful Jan 15 01:59:54.787281 unknown[1775]: wrote ssh authorized keys file for user: core Jan 15 01:59:54.790165 coreos-metadata[1672]: Jan 15 01:59:54.789 INFO Fetch successful Jan 15 01:59:54.790165 coreos-metadata[1672]: Jan 15 01:59:54.789 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 15 01:59:54.833231 update-ssh-keys[1868]: Updated "/home/core/.ssh/authorized_keys" Jan 15 01:59:54.836361 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 15 01:59:54.839838 systemd[1]: Finished sshkeys.service. Jan 15 01:59:54.986546 coreos-metadata[1672]: Jan 15 01:59:54.986 INFO Fetch successful Jan 15 01:59:54.986546 coreos-metadata[1672]: Jan 15 01:59:54.986 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 15 01:59:55.089328 coreos-metadata[1672]: Jan 15 01:59:55.089 INFO Fetch successful Jan 15 01:59:55.176234 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 15 01:59:55.177651 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 15 01:59:55.178035 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 15 01:59:55.178650 systemd[1]: Startup finished in 4.292s (kernel) + 14.388s (initrd) + 13.673s (userspace) = 32.355s. Jan 15 01:59:59.791461 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 15 01:59:59.799465 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 02:00:00.080825 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 02:00:00.092388 (kubelet)[1884]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 02:00:00.159126 kubelet[1884]: E0115 02:00:00.158749 1884 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 02:00:00.170347 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 02:00:00.170721 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 02:00:00.172220 systemd[1]: kubelet.service: Consumed 290ms CPU time, 110M memory peak. Jan 15 02:00:04.623728 systemd[1]: Started sshd@3-10.0.1.164:22-4.153.228.146:36182.service - OpenSSH per-connection server daemon (4.153.228.146:36182). Jan 15 02:00:05.218130 sshd[1892]: Accepted publickey for core from 4.153.228.146 port 36182 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:00:05.220216 sshd-session[1892]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:00:05.235160 systemd-logind[1687]: New session 4 of user core. Jan 15 02:00:05.246446 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 15 02:00:05.526222 sshd[1895]: Connection closed by 4.153.228.146 port 36182 Jan 15 02:00:05.526512 sshd-session[1892]: pam_unix(sshd:session): session closed for user core Jan 15 02:00:05.535943 systemd[1]: sshd@3-10.0.1.164:22-4.153.228.146:36182.service: Deactivated successfully. Jan 15 02:00:05.540021 systemd[1]: session-4.scope: Deactivated successfully. 
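Because no config-drive is found (the repeated "Can't lookup blockdev" messages for /dev/disk/by-label/config-2), both coreos-metadata agents fall back to the link-local metadata service at 169.254.169.254 and fetch the endpoints logged above. A hedged sketch of the same fallback using only URLs that appear in the log; it can only succeed when run from inside such an instance:

    import urllib.request

    # Endpoints copied verbatim from the coreos-metadata entries above.
    ENDPOINTS = [
        "http://169.254.169.254/openstack/2012-08-10/meta_data.json",
        "http://169.254.169.254/latest/meta-data/hostname",
        "http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key",
    ]

    for url in ENDPOINTS:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                print(url, "->", resp.status)
        except OSError as exc:  # unreachable outside the instance
            print(url, "->", exc)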
Jan 15 02:00:05.544787 systemd-logind[1687]: Session 4 logged out. Waiting for processes to exit. Jan 15 02:00:05.546970 systemd-logind[1687]: Removed session 4. Jan 15 02:00:05.643542 systemd[1]: Started sshd@4-10.0.1.164:22-4.153.228.146:36184.service - OpenSSH per-connection server daemon (4.153.228.146:36184). Jan 15 02:00:06.232998 sshd[1901]: Accepted publickey for core from 4.153.228.146 port 36184 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:00:06.235918 sshd-session[1901]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:00:06.249554 systemd-logind[1687]: New session 5 of user core. Jan 15 02:00:06.262460 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 15 02:00:06.531839 sshd[1904]: Connection closed by 4.153.228.146 port 36184 Jan 15 02:00:06.532871 sshd-session[1901]: pam_unix(sshd:session): session closed for user core Jan 15 02:00:06.542711 systemd-logind[1687]: Session 5 logged out. Waiting for processes to exit. Jan 15 02:00:06.543356 systemd[1]: sshd@4-10.0.1.164:22-4.153.228.146:36184.service: Deactivated successfully. Jan 15 02:00:06.547473 systemd[1]: session-5.scope: Deactivated successfully. Jan 15 02:00:06.551812 systemd-logind[1687]: Removed session 5. Jan 15 02:00:06.654398 systemd[1]: Started sshd@5-10.0.1.164:22-4.153.228.146:36196.service - OpenSSH per-connection server daemon (4.153.228.146:36196). Jan 15 02:00:07.229868 sshd[1910]: Accepted publickey for core from 4.153.228.146 port 36196 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:00:07.232698 sshd-session[1910]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:00:07.243651 systemd-logind[1687]: New session 6 of user core. Jan 15 02:00:07.265449 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 15 02:00:07.535466 sshd[1913]: Connection closed by 4.153.228.146 port 36196 Jan 15 02:00:07.536512 sshd-session[1910]: pam_unix(sshd:session): session closed for user core Jan 15 02:00:07.544180 systemd[1]: sshd@5-10.0.1.164:22-4.153.228.146:36196.service: Deactivated successfully. Jan 15 02:00:07.548695 systemd[1]: session-6.scope: Deactivated successfully. Jan 15 02:00:07.553538 systemd-logind[1687]: Session 6 logged out. Waiting for processes to exit. Jan 15 02:00:07.555432 systemd-logind[1687]: Removed session 6. Jan 15 02:00:07.651209 systemd[1]: Started sshd@6-10.0.1.164:22-4.153.228.146:36200.service - OpenSSH per-connection server daemon (4.153.228.146:36200). Jan 15 02:00:08.231540 sshd[1919]: Accepted publickey for core from 4.153.228.146 port 36200 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:00:08.234084 sshd-session[1919]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:00:08.246166 systemd-logind[1687]: New session 7 of user core. Jan 15 02:00:08.253404 systemd[1]: Started session-7.scope - Session 7 of User core. 
Jan 15 02:00:08.482190 sudo[1923]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 15 02:00:08.482884 sudo[1923]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 02:00:08.508782 sudo[1923]: pam_unix(sudo:session): session closed for user root Jan 15 02:00:08.609094 sshd[1922]: Connection closed by 4.153.228.146 port 36200 Jan 15 02:00:08.608134 sshd-session[1919]: pam_unix(sshd:session): session closed for user core Jan 15 02:00:08.616494 systemd[1]: sshd@6-10.0.1.164:22-4.153.228.146:36200.service: Deactivated successfully. Jan 15 02:00:08.620554 systemd[1]: session-7.scope: Deactivated successfully. Jan 15 02:00:08.626033 systemd-logind[1687]: Session 7 logged out. Waiting for processes to exit. Jan 15 02:00:08.628027 systemd-logind[1687]: Removed session 7. Jan 15 02:00:08.735336 systemd[1]: Started sshd@7-10.0.1.164:22-4.153.228.146:36210.service - OpenSSH per-connection server daemon (4.153.228.146:36210). Jan 15 02:00:09.320578 sshd[1929]: Accepted publickey for core from 4.153.228.146 port 36210 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:00:09.323328 sshd-session[1929]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:00:09.336162 systemd-logind[1687]: New session 8 of user core. Jan 15 02:00:09.343446 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 15 02:00:09.535712 sudo[1934]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 15 02:00:09.536571 sudo[1934]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 02:00:09.547505 sudo[1934]: pam_unix(sudo:session): session closed for user root Jan 15 02:00:09.564599 sudo[1933]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 15 02:00:09.565860 sudo[1933]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 02:00:09.591799 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 15 02:00:09.675040 chronyd[1670]: Selected source PHC0 Jan 15 02:00:09.675158 chronyd[1670]: System clock wrong by 1.126094 seconds Jan 15 02:00:10.802657 systemd-resolved[1377]: Clock change detected. Flushing caches. 
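chronyd reports the clock wrong by 1.126094 s and steps it, which is why the journal timestamps jump from 02:00:09.67 straight to 02:00:10.80 and systemd-resolved flushes its caches. A quick arithmetic check against the two timestamps above; the ~1 ms remainder is simply real time that elapsed between the two entries:

    # Seconds within minute 02:00, copied from the entries above.
    reported_step = 1.126094   # "System clock wrong by 1.126094 seconds"
    before = 9.675158          # timestamp of that chronyd entry
    after = 10.802657          # "Clock change detected. Flushing caches."
    jump = after - before
    print(f"observed jump {jump:.6f}s, step {reported_step}s, residue {jump - reported_step:.6f}s")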
Jan 15 02:00:10.802756 chronyd[1670]: System clock was stepped by 1.126094 seconds Jan 15 02:00:10.820000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 15 02:00:10.823821 kernel: kauditd_printk_skb: 40 callbacks suppressed Jan 15 02:00:10.823941 kernel: audit: type=1305 audit(1768442410.820:236): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 15 02:00:10.820000 audit[1956]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffc2c742cb0 a2=420 a3=0 items=0 ppid=1937 pid=1956 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:10.838008 augenrules[1956]: No rules Jan 15 02:00:10.820000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 15 02:00:10.842270 kernel: audit: type=1300 audit(1768442410.820:236): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffc2c742cb0 a2=420 a3=0 items=0 ppid=1937 pid=1956 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:10.842355 kernel: audit: type=1327 audit(1768442410.820:236): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 15 02:00:10.842959 systemd[1]: audit-rules.service: Deactivated successfully. Jan 15 02:00:10.843516 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 15 02:00:10.845133 sudo[1933]: pam_unix(sudo:session): session closed for user root Jan 15 02:00:10.842000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:10.842000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:10.873131 kernel: audit: type=1130 audit(1768442410.842:237): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:10.873233 kernel: audit: type=1131 audit(1768442410.842:238): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:10.844000 audit[1933]: USER_END pid=1933 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 02:00:10.844000 audit[1933]: CRED_DISP pid=1933 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 15 02:00:10.888509 kernel: audit: type=1106 audit(1768442410.844:239): pid=1933 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 02:00:10.888593 kernel: audit: type=1104 audit(1768442410.844:240): pid=1933 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 02:00:10.947236 sshd[1932]: Connection closed by 4.153.228.146 port 36210 Jan 15 02:00:10.948099 sshd-session[1929]: pam_unix(sshd:session): session closed for user core Jan 15 02:00:10.962316 kernel: audit: type=1106 audit(1768442410.951:241): pid=1929 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:00:10.951000 audit[1929]: USER_END pid=1929 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:00:10.957589 systemd-logind[1687]: Session 8 logged out. Waiting for processes to exit. Jan 15 02:00:10.959662 systemd[1]: sshd@7-10.0.1.164:22-4.153.228.146:36210.service: Deactivated successfully. Jan 15 02:00:10.951000 audit[1929]: CRED_DISP pid=1929 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:00:10.965629 systemd[1]: session-8.scope: Deactivated successfully. Jan 15 02:00:10.972388 kernel: audit: type=1104 audit(1768442410.951:242): pid=1929 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:00:10.959000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.1.164:22-4.153.228.146:36210 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:10.974245 systemd-logind[1687]: Removed session 8. Jan 15 02:00:10.981210 kernel: audit: type=1131 audit(1768442410.959:243): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.1.164:22-4.153.228.146:36210 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:11.066000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.1.164:22-4.153.228.146:36224 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:11.067387 systemd[1]: Started sshd@8-10.0.1.164:22-4.153.228.146:36224.service - OpenSSH per-connection server daemon (4.153.228.146:36224). Jan 15 02:00:11.535502 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
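The audit SYSCALL records carry the offending command line as a PROCTITLE field: hex-encoded bytes with NUL separators between arguments. The record above decodes to /sbin/auditctl -R /etc/audit/audit.rules, and the netfilter records in the Docker section below decode the same way. A small standard-library sketch of the decoding:

    def decode_proctitle(hex_field: str) -> list[str]:
        """Split an audit PROCTITLE value into the argv it encodes
        (hex-encoded bytes, NUL bytes between arguments)."""
        return [arg.decode() for arg in bytes.fromhex(hex_field).split(b"\x00") if arg]

    # The auditctl record logged above:
    print(decode_proctitle(
        "2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573"
    ))
    # -> ['/sbin/auditctl', '-R', '/etc/audit/audit.rules']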
Jan 15 02:00:11.541076 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 02:00:11.643000 audit[1965]: USER_ACCT pid=1965 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:00:11.645476 sshd[1965]: Accepted publickey for core from 4.153.228.146 port 36224 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:00:11.644000 audit[1965]: CRED_ACQ pid=1965 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:00:11.644000 audit[1965]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffea502ec00 a2=3 a3=0 items=0 ppid=1 pid=1965 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:11.644000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:00:11.646963 sshd-session[1965]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:00:11.661628 systemd-logind[1687]: New session 9 of user core. Jan 15 02:00:11.668556 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 15 02:00:11.676000 audit[1965]: USER_START pid=1965 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:00:11.680000 audit[1971]: CRED_ACQ pid=1971 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:00:11.739069 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 02:00:11.738000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:11.748480 (kubelet)[1977]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 02:00:11.788737 kubelet[1977]: E0115 02:00:11.788317 1977 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 02:00:11.792621 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 02:00:11.792969 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 02:00:11.793781 systemd[1]: kubelet.service: Consumed 202ms CPU time, 110.3M memory peak. 
Jan 15 02:00:11.792000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 15 02:00:11.856000 audit[1984]: USER_ACCT pid=1984 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 02:00:11.857550 sudo[1984]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 15 02:00:11.856000 audit[1984]: CRED_REFR pid=1984 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 02:00:11.858391 sudo[1984]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 15 02:00:11.860000 audit[1984]: USER_START pid=1984 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 02:00:12.746134 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 15 02:00:12.757520 (dockerd)[2003]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 15 02:00:13.388354 dockerd[2003]: time="2026-01-15T02:00:13.388243196Z" level=info msg="Starting up" Jan 15 02:00:13.390026 dockerd[2003]: time="2026-01-15T02:00:13.389894099Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 15 02:00:13.417552 dockerd[2003]: time="2026-01-15T02:00:13.417403067Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 15 02:00:13.497575 dockerd[2003]: time="2026-01-15T02:00:13.497501093Z" level=info msg="Loading containers: start." 
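dockerd now sets up its netfilter scaffolding: the audit records that follow show iptables and ip6tables (via xtables-nft-multi) registering the DOCKER, DOCKER-FORWARD, DOCKER-BRIDGE, DOCKER-CT, DOCKER-ISOLATION-STAGE-1/2 and DOCKER-USER chains and hooking them into PREROUTING, OUTPUT and FORWARD, once for IPv4 and once for IPv6. A minimal sketch for inspecting a few of those chains afterwards; it assumes root on the host and uses only chain names and flags visible in the records below:

    import subprocess

    # Chain names decoded from the PROCTITLE records below; --wait is the same
    # flag the daemon itself passes. Needs root to read the netfilter tables.
    CHAINS = [("nat", "DOCKER"), ("filter", "DOCKER-FORWARD"), ("filter", "DOCKER-USER")]

    for table, chain in CHAINS:
        subprocess.run(["iptables", "--wait", "-t", table, "-n", "-L", chain], check=False)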
Jan 15 02:00:13.530209 kernel: Initializing XFRM netlink socket Jan 15 02:00:13.655000 audit[2051]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2051 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.655000 audit[2051]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffd4bb26930 a2=0 a3=0 items=0 ppid=2003 pid=2051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.655000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 15 02:00:13.659000 audit[2053]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2053 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.659000 audit[2053]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffeaaba7fb0 a2=0 a3=0 items=0 ppid=2003 pid=2053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.659000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 15 02:00:13.663000 audit[2055]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2055 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.663000 audit[2055]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffef9c31670 a2=0 a3=0 items=0 ppid=2003 pid=2055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.663000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 15 02:00:13.667000 audit[2057]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2057 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.667000 audit[2057]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd248dd370 a2=0 a3=0 items=0 ppid=2003 pid=2057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.667000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 15 02:00:13.671000 audit[2059]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2059 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.671000 audit[2059]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe5cacee00 a2=0 a3=0 items=0 ppid=2003 pid=2059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.671000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 15 02:00:13.675000 audit[2061]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2061 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.675000 audit[2061]: SYSCALL arch=c000003e syscall=46 
success=yes exit=112 a0=3 a1=7ffe61094520 a2=0 a3=0 items=0 ppid=2003 pid=2061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.675000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 15 02:00:13.678000 audit[2063]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2063 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.678000 audit[2063]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffcee8f0070 a2=0 a3=0 items=0 ppid=2003 pid=2063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.678000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 15 02:00:13.683000 audit[2065]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2065 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.683000 audit[2065]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffff608b8b0 a2=0 a3=0 items=0 ppid=2003 pid=2065 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.683000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 15 02:00:13.752000 audit[2068]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2068 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.752000 audit[2068]: SYSCALL arch=c000003e syscall=46 success=yes exit=472 a0=3 a1=7fff5e806450 a2=0 a3=0 items=0 ppid=2003 pid=2068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.752000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 15 02:00:13.758000 audit[2070]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2070 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.758000 audit[2070]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffc995211b0 a2=0 a3=0 items=0 ppid=2003 pid=2070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.758000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 15 02:00:13.763000 audit[2072]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2072 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.763000 audit[2072]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7ffd9d409c40 a2=0 
a3=0 items=0 ppid=2003 pid=2072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.763000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 15 02:00:13.768000 audit[2074]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2074 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.768000 audit[2074]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffd3e5a8250 a2=0 a3=0 items=0 ppid=2003 pid=2074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.768000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 15 02:00:13.774000 audit[2076]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2076 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.774000 audit[2076]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffe311680e0 a2=0 a3=0 items=0 ppid=2003 pid=2076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.774000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 15 02:00:13.860000 audit[2106]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2106 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.860000 audit[2106]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffd988fb9d0 a2=0 a3=0 items=0 ppid=2003 pid=2106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.860000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 15 02:00:13.864000 audit[2108]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2108 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.864000 audit[2108]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffcaa90f560 a2=0 a3=0 items=0 ppid=2003 pid=2108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.864000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 15 02:00:13.869000 audit[2110]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2110 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.869000 audit[2110]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdcf5b3230 a2=0 a3=0 items=0 ppid=2003 pid=2110 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 15 02:00:13.869000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 15 02:00:13.873000 audit[2112]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2112 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.873000 audit[2112]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc172b6d60 a2=0 a3=0 items=0 ppid=2003 pid=2112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.873000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 15 02:00:13.876000 audit[2114]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2114 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.876000 audit[2114]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffeb2b44180 a2=0 a3=0 items=0 ppid=2003 pid=2114 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.876000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 15 02:00:13.881000 audit[2116]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2116 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.881000 audit[2116]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffcbe1b9cc0 a2=0 a3=0 items=0 ppid=2003 pid=2116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.881000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 15 02:00:13.885000 audit[2118]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2118 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.885000 audit[2118]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fffd78d1080 a2=0 a3=0 items=0 ppid=2003 pid=2118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.885000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 15 02:00:13.890000 audit[2120]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2120 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.890000 audit[2120]: SYSCALL arch=c000003e syscall=46 success=yes exit=384 a0=3 a1=7ffc0bd5efa0 a2=0 a3=0 items=0 ppid=2003 pid=2120 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.890000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 15 02:00:13.894000 audit[2122]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2122 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.894000 audit[2122]: SYSCALL arch=c000003e syscall=46 success=yes exit=484 a0=3 a1=7ffe9ba48a00 a2=0 a3=0 items=0 ppid=2003 pid=2122 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.894000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 15 02:00:13.898000 audit[2124]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2124 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.898000 audit[2124]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffc0bdaf9d0 a2=0 a3=0 items=0 ppid=2003 pid=2124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.898000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 15 02:00:13.902000 audit[2126]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2126 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.902000 audit[2126]: SYSCALL arch=c000003e syscall=46 success=yes exit=236 a0=3 a1=7fffe3178eb0 a2=0 a3=0 items=0 ppid=2003 pid=2126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.902000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 15 02:00:13.905000 audit[2128]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2128 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.905000 audit[2128]: SYSCALL arch=c000003e syscall=46 success=yes exit=248 a0=3 a1=7ffd6e20fff0 a2=0 a3=0 items=0 ppid=2003 pid=2128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.905000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 15 02:00:13.908000 audit[2130]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2130 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.908000 audit[2130]: SYSCALL arch=c000003e syscall=46 success=yes exit=232 a0=3 a1=7ffeb9171af0 a2=0 a3=0 items=0 ppid=2003 pid=2130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.908000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 15 02:00:13.917000 audit[2135]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2135 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.917000 audit[2135]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffc9921c940 a2=0 a3=0 items=0 ppid=2003 pid=2135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.917000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 15 02:00:13.921000 audit[2137]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2137 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.921000 audit[2137]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffffb548a50 a2=0 a3=0 items=0 ppid=2003 pid=2137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.921000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 15 02:00:13.925000 audit[2139]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2139 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.925000 audit[2139]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7fff540e00e0 a2=0 a3=0 items=0 ppid=2003 pid=2139 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.925000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 15 02:00:13.928000 audit[2141]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2141 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.928000 audit[2141]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffec31681b0 a2=0 a3=0 items=0 ppid=2003 pid=2141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.928000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 15 02:00:13.932000 audit[2143]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2143 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.932000 audit[2143]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7fffea5b55d0 a2=0 a3=0 items=0 ppid=2003 pid=2143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.932000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 15 02:00:13.936000 audit[2145]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2145 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:13.936000 audit[2145]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7fffa0ca3d60 a2=0 a3=0 items=0 ppid=2003 pid=2145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.936000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 15 02:00:13.974000 audit[2150]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2150 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.974000 audit[2150]: SYSCALL arch=c000003e syscall=46 success=yes exit=520 a0=3 a1=7ffc19f53020 a2=0 a3=0 items=0 ppid=2003 pid=2150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.974000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 15 02:00:13.978000 audit[2152]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2152 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.978000 audit[2152]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7fff8156c750 a2=0 a3=0 items=0 ppid=2003 pid=2152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.978000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 15 02:00:13.992000 audit[2160]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2160 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:13.992000 audit[2160]: SYSCALL arch=c000003e syscall=46 success=yes exit=300 a0=3 a1=7ffd39698fc0 a2=0 a3=0 items=0 ppid=2003 pid=2160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:13.992000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 15 02:00:14.011000 audit[2166]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2166 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:14.011000 audit[2166]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7fff4041a5f0 a2=0 a3=0 items=0 ppid=2003 pid=2166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:14.011000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 15 02:00:14.014000 audit[2168]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2168 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 
02:00:14.014000 audit[2168]: SYSCALL arch=c000003e syscall=46 success=yes exit=512 a0=3 a1=7ffc41adf1d0 a2=0 a3=0 items=0 ppid=2003 pid=2168 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:14.014000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 15 02:00:14.018000 audit[2170]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2170 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:14.018000 audit[2170]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffe9d771130 a2=0 a3=0 items=0 ppid=2003 pid=2170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:14.018000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 15 02:00:14.021000 audit[2172]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2172 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:14.021000 audit[2172]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffd374cb300 a2=0 a3=0 items=0 ppid=2003 pid=2172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:14.021000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 15 02:00:14.025000 audit[2174]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2174 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:14.025000 audit[2174]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffca5344f70 a2=0 a3=0 items=0 ppid=2003 pid=2174 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:14.025000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 15 02:00:14.027050 systemd-networkd[1600]: docker0: Link UP Jan 15 02:00:14.038471 dockerd[2003]: time="2026-01-15T02:00:14.038399455Z" level=info msg="Loading containers: done." Jan 15 02:00:14.063063 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck302031450-merged.mount: Deactivated successfully. 
Jan 15 02:00:14.082536 dockerd[2003]: time="2026-01-15T02:00:14.082498756Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 15 02:00:14.082804 dockerd[2003]: time="2026-01-15T02:00:14.082586584Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 15 02:00:14.082804 dockerd[2003]: time="2026-01-15T02:00:14.082680056Z" level=info msg="Initializing buildkit" Jan 15 02:00:14.140881 dockerd[2003]: time="2026-01-15T02:00:14.140787975Z" level=info msg="Completed buildkit initialization" Jan 15 02:00:14.162505 dockerd[2003]: time="2026-01-15T02:00:14.161504564Z" level=info msg="Daemon has completed initialization" Jan 15 02:00:14.162505 dockerd[2003]: time="2026-01-15T02:00:14.161651989Z" level=info msg="API listen on /run/docker.sock" Jan 15 02:00:14.162000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:14.163417 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 15 02:00:16.317090 containerd[1710]: time="2026-01-15T02:00:16.316932945Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\"" Jan 15 02:00:17.104750 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2037275533.mount: Deactivated successfully. Jan 15 02:00:18.345334 containerd[1710]: time="2026-01-15T02:00:18.345296468Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:18.348304 containerd[1710]: time="2026-01-15T02:00:18.348279010Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.11: active requests=0, bytes read=27401903" Jan 15 02:00:18.365328 containerd[1710]: time="2026-01-15T02:00:18.365306078Z" level=info msg="ImageCreate event name:\"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:18.371919 containerd[1710]: time="2026-01-15T02:00:18.371884824Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:18.372766 containerd[1710]: time="2026-01-15T02:00:18.372646060Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.11\" with image id \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\", size \"29067246\" in 2.055644601s" Jan 15 02:00:18.372766 containerd[1710]: time="2026-01-15T02:00:18.372673160Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\" returns image reference \"sha256:7757c58248a29fc7474a8072796848689852b0477adf16765f38b3d1a9bacadf\"" Jan 15 02:00:18.373190 containerd[1710]: time="2026-01-15T02:00:18.373178560Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\"" Jan 15 02:00:19.931310 containerd[1710]: time="2026-01-15T02:00:19.931262926Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-controller-manager:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:19.933023 containerd[1710]: time="2026-01-15T02:00:19.932861348Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.11: active requests=0, bytes read=24987951" Jan 15 02:00:19.934551 containerd[1710]: time="2026-01-15T02:00:19.934533288Z" level=info msg="ImageCreate event name:\"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:19.937534 containerd[1710]: time="2026-01-15T02:00:19.937513183Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:19.938190 containerd[1710]: time="2026-01-15T02:00:19.938174190Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.11\" with image id \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\", size \"26650388\" in 1.564947141s" Jan 15 02:00:19.938254 containerd[1710]: time="2026-01-15T02:00:19.938244467Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\" returns image reference \"sha256:0175d0a8243db520e3caa6d5c1e4248fddbc32447a9e8b5f4630831bc1e2489e\"" Jan 15 02:00:19.938928 containerd[1710]: time="2026-01-15T02:00:19.938910948Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\"" Jan 15 02:00:21.533935 containerd[1710]: time="2026-01-15T02:00:21.533287402Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:21.535712 containerd[1710]: time="2026-01-15T02:00:21.535694217Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.11: active requests=0, bytes read=19396939" Jan 15 02:00:21.537411 containerd[1710]: time="2026-01-15T02:00:21.537398415Z" level=info msg="ImageCreate event name:\"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:21.540345 containerd[1710]: time="2026-01-15T02:00:21.540325995Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:21.540840 containerd[1710]: time="2026-01-15T02:00:21.540755106Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.11\" with image id \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\", size \"21062128\" in 1.601820904s" Jan 15 02:00:21.540911 containerd[1710]: time="2026-01-15T02:00:21.540901175Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\" returns image reference \"sha256:23d6a1fb92fda53b787f364351c610e55f073e8bdf0de5831974df7875b13f21\"" Jan 15 02:00:21.541294 containerd[1710]: time="2026-01-15T02:00:21.541275560Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.32.11\"" Jan 15 02:00:21.807637 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 15 02:00:21.812550 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 02:00:22.030417 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 02:00:22.029000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:22.032328 kernel: kauditd_printk_skb: 134 callbacks suppressed Jan 15 02:00:22.032392 kernel: audit: type=1130 audit(1768442422.029:296): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:22.052518 (kubelet)[2287]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 02:00:22.494700 kubelet[2287]: E0115 02:00:22.494612 2287 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 02:00:22.499669 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 02:00:22.500559 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 02:00:22.500000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 15 02:00:22.505220 systemd[1]: kubelet.service: Consumed 283ms CPU time, 108.9M memory peak. Jan 15 02:00:22.514198 kernel: audit: type=1131 audit(1768442422.500:297): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 15 02:00:23.094117 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount631823450.mount: Deactivated successfully. 
Jan 15 02:00:23.500709 containerd[1710]: time="2026-01-15T02:00:23.500603502Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:23.503132 containerd[1710]: time="2026-01-15T02:00:23.503107303Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.11: active requests=0, bytes read=19572392" Jan 15 02:00:23.505455 containerd[1710]: time="2026-01-15T02:00:23.505430905Z" level=info msg="ImageCreate event name:\"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:23.520489 containerd[1710]: time="2026-01-15T02:00:23.520455127Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:23.520913 containerd[1710]: time="2026-01-15T02:00:23.520895033Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.11\" with image id \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\", repo tag \"registry.k8s.io/kube-proxy:v1.32.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\", size \"31160918\" in 1.979593811s" Jan 15 02:00:23.520956 containerd[1710]: time="2026-01-15T02:00:23.520918027Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\" returns image reference \"sha256:4d8fb2dc5751966f058943ff7c5f10551e603d726ab8648c7c7b7f95a2663e3d\"" Jan 15 02:00:23.521633 containerd[1710]: time="2026-01-15T02:00:23.521558754Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jan 15 02:00:24.248524 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3671002615.mount: Deactivated successfully. 
Jan 15 02:00:25.777288 containerd[1710]: time="2026-01-15T02:00:25.777237250Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:25.779080 containerd[1710]: time="2026-01-15T02:00:25.778887402Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=17815675" Jan 15 02:00:25.780596 containerd[1710]: time="2026-01-15T02:00:25.780577791Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:25.783497 containerd[1710]: time="2026-01-15T02:00:25.783474438Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:25.784325 containerd[1710]: time="2026-01-15T02:00:25.784304609Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 2.262719777s" Jan 15 02:00:25.784371 containerd[1710]: time="2026-01-15T02:00:25.784329836Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jan 15 02:00:25.784712 containerd[1710]: time="2026-01-15T02:00:25.784682730Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 15 02:00:26.428751 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3740031173.mount: Deactivated successfully. 
Jan 15 02:00:26.439779 containerd[1710]: time="2026-01-15T02:00:26.439694622Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 15 02:00:26.442853 containerd[1710]: time="2026-01-15T02:00:26.442221959Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 15 02:00:26.444764 containerd[1710]: time="2026-01-15T02:00:26.444714887Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 15 02:00:26.451386 containerd[1710]: time="2026-01-15T02:00:26.451325538Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 15 02:00:26.452709 containerd[1710]: time="2026-01-15T02:00:26.452652688Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 667.937362ms" Jan 15 02:00:26.452825 containerd[1710]: time="2026-01-15T02:00:26.452713861Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 15 02:00:26.454292 containerd[1710]: time="2026-01-15T02:00:26.453371968Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jan 15 02:00:27.147365 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3919804849.mount: Deactivated successfully. 
Jan 15 02:00:30.187701 containerd[1710]: time="2026-01-15T02:00:30.187620734Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:30.189274 containerd[1710]: time="2026-01-15T02:00:30.189249550Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=55728979" Jan 15 02:00:30.191371 containerd[1710]: time="2026-01-15T02:00:30.191334006Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:30.196171 containerd[1710]: time="2026-01-15T02:00:30.195480528Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:00:30.196268 containerd[1710]: time="2026-01-15T02:00:30.196249636Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.742832418s" Jan 15 02:00:30.196323 containerd[1710]: time="2026-01-15T02:00:30.196313882Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Jan 15 02:00:31.967854 update_engine[1688]: I20260115 02:00:31.967743 1688 update_attempter.cc:509] Updating boot flags... Jan 15 02:00:32.556823 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 15 02:00:32.559616 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 02:00:32.727000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:32.728954 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 02:00:32.733610 kernel: audit: type=1130 audit(1768442432.727:298): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:32.738463 (kubelet)[2459]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 15 02:00:33.055232 kubelet[2459]: E0115 02:00:33.054986 2459 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 15 02:00:33.058217 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 15 02:00:33.058340 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 15 02:00:33.058692 systemd[1]: kubelet.service: Consumed 176ms CPU time, 110.2M memory peak. Jan 15 02:00:33.057000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Jan 15 02:00:33.062167 kernel: audit: type=1131 audit(1768442433.057:299): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 15 02:00:33.764225 kernel: audit: type=1130 audit(1768442433.753:300): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:33.764394 kernel: audit: type=1131 audit(1768442433.753:301): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:33.753000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:33.753000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:33.754343 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 02:00:33.754495 systemd[1]: kubelet.service: Consumed 176ms CPU time, 110.2M memory peak. Jan 15 02:00:33.758345 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 02:00:33.797193 systemd[1]: Reload requested from client PID 2474 ('systemctl') (unit session-9.scope)... Jan 15 02:00:33.797203 systemd[1]: Reloading... Jan 15 02:00:33.892184 zram_generator::config[2515]: No configuration found. Jan 15 02:00:34.112171 systemd[1]: Reloading finished in 314 ms. Jan 15 02:00:34.146168 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 15 02:00:34.146384 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 15 02:00:34.146836 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 02:00:34.147001 systemd[1]: kubelet.service: Consumed 119ms CPU time, 92.3M memory peak. Jan 15 02:00:34.150209 kernel: audit: type=1130 audit(1768442434.145:302): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 15 02:00:34.145000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 15 02:00:34.152989 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 15 02:00:34.153000 audit: BPF prog-id=63 op=LOAD Jan 15 02:00:34.154000 audit: BPF prog-id=58 op=UNLOAD Jan 15 02:00:34.157684 kernel: audit: type=1334 audit(1768442434.153:303): prog-id=63 op=LOAD Jan 15 02:00:34.157749 kernel: audit: type=1334 audit(1768442434.154:304): prog-id=58 op=UNLOAD Jan 15 02:00:34.155000 audit: BPF prog-id=64 op=LOAD Jan 15 02:00:34.155000 audit: BPF prog-id=49 op=UNLOAD Jan 15 02:00:34.161607 kernel: audit: type=1334 audit(1768442434.155:305): prog-id=64 op=LOAD Jan 15 02:00:34.161683 kernel: audit: type=1334 audit(1768442434.155:306): prog-id=49 op=UNLOAD Jan 15 02:00:34.161729 kernel: audit: type=1334 audit(1768442434.159:307): prog-id=65 op=LOAD Jan 15 02:00:34.159000 audit: BPF prog-id=65 op=LOAD Jan 15 02:00:34.166000 audit: BPF prog-id=52 op=UNLOAD Jan 15 02:00:34.166000 audit: BPF prog-id=66 op=LOAD Jan 15 02:00:34.166000 audit: BPF prog-id=67 op=LOAD Jan 15 02:00:34.166000 audit: BPF prog-id=53 op=UNLOAD Jan 15 02:00:34.166000 audit: BPF prog-id=54 op=UNLOAD Jan 15 02:00:34.167000 audit: BPF prog-id=68 op=LOAD Jan 15 02:00:34.168000 audit: BPF prog-id=43 op=UNLOAD Jan 15 02:00:34.168000 audit: BPF prog-id=69 op=LOAD Jan 15 02:00:34.168000 audit: BPF prog-id=70 op=LOAD Jan 15 02:00:34.168000 audit: BPF prog-id=44 op=UNLOAD Jan 15 02:00:34.168000 audit: BPF prog-id=45 op=UNLOAD Jan 15 02:00:34.170000 audit: BPF prog-id=71 op=LOAD Jan 15 02:00:34.171000 audit: BPF prog-id=59 op=UNLOAD Jan 15 02:00:34.171000 audit: BPF prog-id=72 op=LOAD Jan 15 02:00:34.171000 audit: BPF prog-id=73 op=LOAD Jan 15 02:00:34.171000 audit: BPF prog-id=50 op=UNLOAD Jan 15 02:00:34.172000 audit: BPF prog-id=51 op=UNLOAD Jan 15 02:00:34.175000 audit: BPF prog-id=74 op=LOAD Jan 15 02:00:34.175000 audit: BPF prog-id=46 op=UNLOAD Jan 15 02:00:34.175000 audit: BPF prog-id=75 op=LOAD Jan 15 02:00:34.175000 audit: BPF prog-id=76 op=LOAD Jan 15 02:00:34.175000 audit: BPF prog-id=47 op=UNLOAD Jan 15 02:00:34.175000 audit: BPF prog-id=48 op=UNLOAD Jan 15 02:00:34.178000 audit: BPF prog-id=77 op=LOAD Jan 15 02:00:34.178000 audit: BPF prog-id=60 op=UNLOAD Jan 15 02:00:34.178000 audit: BPF prog-id=78 op=LOAD Jan 15 02:00:34.178000 audit: BPF prog-id=79 op=LOAD Jan 15 02:00:34.178000 audit: BPF prog-id=61 op=UNLOAD Jan 15 02:00:34.178000 audit: BPF prog-id=62 op=UNLOAD Jan 15 02:00:34.179000 audit: BPF prog-id=80 op=LOAD Jan 15 02:00:34.179000 audit: BPF prog-id=55 op=UNLOAD Jan 15 02:00:34.179000 audit: BPF prog-id=81 op=LOAD Jan 15 02:00:34.179000 audit: BPF prog-id=82 op=LOAD Jan 15 02:00:34.179000 audit: BPF prog-id=56 op=UNLOAD Jan 15 02:00:34.179000 audit: BPF prog-id=57 op=UNLOAD Jan 15 02:00:37.414628 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 02:00:37.414000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:37.440770 (kubelet)[2570]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 15 02:00:37.520230 kubelet[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 02:00:37.520629 kubelet[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. 
Image garbage collector will get sandbox image information from CRI. Jan 15 02:00:37.520698 kubelet[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 02:00:37.520958 kubelet[2570]: I0115 02:00:37.520926 2570 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 15 02:00:38.112169 kubelet[2570]: I0115 02:00:38.111515 2570 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 15 02:00:38.112169 kubelet[2570]: I0115 02:00:38.111537 2570 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 15 02:00:38.112169 kubelet[2570]: I0115 02:00:38.111785 2570 server.go:954] "Client rotation is on, will bootstrap in background" Jan 15 02:00:38.151068 kubelet[2570]: E0115 02:00:38.151009 2570 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.1.164:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.1.164:6443: connect: connection refused" logger="UnhandledError" Jan 15 02:00:38.164345 kubelet[2570]: I0115 02:00:38.164328 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 15 02:00:38.195313 kubelet[2570]: I0115 02:00:38.195281 2570 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 15 02:00:38.202692 kubelet[2570]: I0115 02:00:38.202665 2570 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 15 02:00:38.205519 kubelet[2570]: I0115 02:00:38.205440 2570 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 15 02:00:38.206013 kubelet[2570]: I0115 02:00:38.205535 2570 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4515-1-0-n-e5e35ee394","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 15 02:00:38.206117 kubelet[2570]: I0115 02:00:38.206036 2570 topology_manager.go:138] "Creating topology manager with none policy" Jan 15 02:00:38.206117 kubelet[2570]: I0115 02:00:38.206060 2570 container_manager_linux.go:304] "Creating device plugin manager" Jan 15 02:00:38.206490 kubelet[2570]: I0115 02:00:38.206458 2570 state_mem.go:36] "Initialized new in-memory state store" Jan 15 02:00:38.216866 kubelet[2570]: I0115 02:00:38.216836 2570 kubelet.go:446] "Attempting to sync node with API server" Jan 15 02:00:38.216941 kubelet[2570]: I0115 02:00:38.216920 2570 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 15 02:00:38.216998 kubelet[2570]: I0115 02:00:38.216983 2570 kubelet.go:352] "Adding apiserver pod source" Jan 15 02:00:38.217023 kubelet[2570]: I0115 02:00:38.217013 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 15 02:00:38.220238 kubelet[2570]: W0115 02:00:38.220055 2570 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.1.164:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-n-e5e35ee394&limit=500&resourceVersion=0": dial tcp 10.0.1.164:6443: connect: connection refused Jan 15 02:00:38.220238 kubelet[2570]: E0115 02:00:38.220114 2570 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.1.164:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-n-e5e35ee394&limit=500&resourceVersion=0\": dial tcp 10.0.1.164:6443: connect: connection refused" logger="UnhandledError" Jan 15 02:00:38.224706 kubelet[2570]: 
W0115 02:00:38.224662 2570 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.1.164:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.1.164:6443: connect: connection refused Jan 15 02:00:38.225116 kubelet[2570]: E0115 02:00:38.224901 2570 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.1.164:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.1.164:6443: connect: connection refused" logger="UnhandledError" Jan 15 02:00:38.225574 kubelet[2570]: I0115 02:00:38.225545 2570 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 15 02:00:38.226688 kubelet[2570]: I0115 02:00:38.226653 2570 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 15 02:00:38.228371 kubelet[2570]: W0115 02:00:38.228344 2570 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 15 02:00:38.233648 kubelet[2570]: I0115 02:00:38.233617 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 15 02:00:38.233862 kubelet[2570]: I0115 02:00:38.233844 2570 server.go:1287] "Started kubelet" Jan 15 02:00:38.236879 kubelet[2570]: I0115 02:00:38.236848 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 15 02:00:38.242215 kubelet[2570]: E0115 02:00:38.239311 2570 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.1.164:6443/api/v1/namespaces/default/events\": dial tcp 10.0.1.164:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4515-1-0-n-e5e35ee394.188ac50728a6a7f9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4515-1-0-n-e5e35ee394,UID:ci-4515-1-0-n-e5e35ee394,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4515-1-0-n-e5e35ee394,},FirstTimestamp:2026-01-15 02:00:38.233794553 +0000 UTC m=+0.783702764,LastTimestamp:2026-01-15 02:00:38.233794553 +0000 UTC m=+0.783702764,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4515-1-0-n-e5e35ee394,}" Jan 15 02:00:38.248078 kernel: kauditd_printk_skb: 36 callbacks suppressed Jan 15 02:00:38.248164 kernel: audit: type=1325 audit(1768442438.242:344): table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2581 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:38.242000 audit[2581]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2581 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:38.248535 kubelet[2570]: I0115 02:00:38.248514 2570 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 15 02:00:38.249477 kubelet[2570]: I0115 02:00:38.249452 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 15 02:00:38.249837 kubelet[2570]: E0115 02:00:38.249810 2570 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:38.255362 kernel: audit: type=1300 audit(1768442438.242:344): arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffef7ec8100 a2=0 a3=0 
items=0 ppid=2570 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.242000 audit[2581]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffef7ec8100 a2=0 a3=0 items=0 ppid=2570 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.255517 kubelet[2570]: I0115 02:00:38.250004 2570 server.go:479] "Adding debug handlers to kubelet server" Jan 15 02:00:38.257173 kubelet[2570]: I0115 02:00:38.254684 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 15 02:00:38.257173 kubelet[2570]: I0115 02:00:38.254760 2570 reconciler.go:26] "Reconciler: start to sync state" Jan 15 02:00:38.257173 kubelet[2570]: I0115 02:00:38.256458 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 15 02:00:38.257173 kubelet[2570]: I0115 02:00:38.256644 2570 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 15 02:00:38.257173 kubelet[2570]: I0115 02:00:38.256802 2570 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 15 02:00:38.242000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 15 02:00:38.262300 kernel: audit: type=1327 audit(1768442438.242:344): proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 15 02:00:38.263031 kubelet[2570]: E0115 02:00:38.262989 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.1.164:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-n-e5e35ee394?timeout=10s\": dial tcp 10.0.1.164:6443: connect: connection refused" interval="200ms" Jan 15 02:00:38.244000 audit[2582]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2582 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:38.270790 kernel: audit: type=1325 audit(1768442438.244:345): table=filter:43 family=2 entries=1 op=nft_register_chain pid=2582 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:38.270873 kernel: audit: type=1300 audit(1768442438.244:345): arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff5762b2f0 a2=0 a3=0 items=0 ppid=2570 pid=2582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.244000 audit[2582]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff5762b2f0 a2=0 a3=0 items=0 ppid=2570 pid=2582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.271028 kubelet[2570]: I0115 02:00:38.265736 2570 factory.go:221] Registration of the systemd container factory successfully Jan 15 02:00:38.271028 kubelet[2570]: I0115 02:00:38.265801 2570 factory.go:219] Registration of the crio container factory failed: Get 
"http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 15 02:00:38.272140 kubelet[2570]: I0115 02:00:38.272119 2570 factory.go:221] Registration of the containerd container factory successfully Jan 15 02:00:38.244000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 15 02:00:38.273088 kubelet[2570]: W0115 02:00:38.273032 2570 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.1.164:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.1.164:6443: connect: connection refused Jan 15 02:00:38.275280 kernel: audit: type=1327 audit(1768442438.244:345): proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 15 02:00:38.275361 kubelet[2570]: E0115 02:00:38.275190 2570 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.1.164:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.1.164:6443: connect: connection refused" logger="UnhandledError" Jan 15 02:00:38.251000 audit[2584]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2584 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:38.251000 audit[2584]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fff87648e40 a2=0 a3=0 items=0 ppid=2570 pid=2584 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.281421 kernel: audit: type=1325 audit(1768442438.251:346): table=filter:44 family=2 entries=2 op=nft_register_chain pid=2584 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:38.281505 kernel: audit: type=1300 audit(1768442438.251:346): arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7fff87648e40 a2=0 a3=0 items=0 ppid=2570 pid=2584 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.251000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 15 02:00:38.286167 kernel: audit: type=1327 audit(1768442438.251:346): proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 15 02:00:38.255000 audit[2586]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2586 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:38.288407 kubelet[2570]: I0115 02:00:38.288381 2570 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Jan 15 02:00:38.291176 kernel: audit: type=1325 audit(1768442438.255:347): table=filter:45 family=2 entries=2 op=nft_register_chain pid=2586 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:38.255000 audit[2586]: SYSCALL arch=c000003e syscall=46 success=yes exit=340 a0=3 a1=7ffe64acd390 a2=0 a3=0 items=0 ppid=2570 pid=2586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.255000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 15 02:00:38.275000 audit[2591]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2591 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:38.275000 audit[2591]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7fffc6511740 a2=0 a3=0 items=0 ppid=2570 pid=2591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.275000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 15 02:00:38.290000 audit[2595]: NETFILTER_CFG table=mangle:47 family=2 entries=1 op=nft_register_chain pid=2595 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:38.290000 audit[2595]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe8d9cfd80 a2=0 a3=0 items=0 ppid=2570 pid=2595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.290000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 15 02:00:38.291000 audit[2594]: NETFILTER_CFG table=mangle:48 family=10 entries=2 op=nft_register_chain pid=2594 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:38.293000 audit[2596]: NETFILTER_CFG table=nat:49 family=2 entries=1 op=nft_register_chain pid=2596 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:38.293000 audit[2596]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffef625f050 a2=0 a3=0 items=0 ppid=2570 pid=2596 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.293000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 15 02:00:38.291000 audit[2594]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffe96f82420 a2=0 a3=0 items=0 ppid=2570 pid=2594 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.291000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 15 
02:00:38.296000 audit[2599]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_chain pid=2599 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:38.296000 audit[2599]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffee16bb5d0 a2=0 a3=0 items=0 ppid=2570 pid=2599 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.296000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 15 02:00:38.299323 kubelet[2570]: I0115 02:00:38.299303 2570 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 15 02:00:38.299410 kubelet[2570]: I0115 02:00:38.299401 2570 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 15 02:00:38.299485 kubelet[2570]: I0115 02:00:38.299474 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 15 02:00:38.299535 kubelet[2570]: I0115 02:00:38.299528 2570 kubelet.go:2382] "Starting kubelet main sync loop" Jan 15 02:00:38.299664 kubelet[2570]: E0115 02:00:38.299629 2570 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 15 02:00:38.299000 audit[2600]: NETFILTER_CFG table=mangle:51 family=10 entries=1 op=nft_register_chain pid=2600 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:38.299000 audit[2600]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe1daa7360 a2=0 a3=0 items=0 ppid=2570 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.299000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 15 02:00:38.303190 kubelet[2570]: W0115 02:00:38.302987 2570 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.1.164:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.1.164:6443: connect: connection refused Jan 15 02:00:38.303299 kubelet[2570]: E0115 02:00:38.303280 2570 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.1.164:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.1.164:6443: connect: connection refused" logger="UnhandledError" Jan 15 02:00:38.304945 kubelet[2570]: I0115 02:00:38.304928 2570 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 15 02:00:38.305019 kubelet[2570]: I0115 02:00:38.304950 2570 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 15 02:00:38.305019 kubelet[2570]: I0115 02:00:38.304963 2570 state_mem.go:36] "Initialized new in-memory state store" Jan 15 02:00:38.304000 audit[2601]: NETFILTER_CFG table=nat:52 family=10 entries=1 op=nft_register_chain pid=2601 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:38.304000 audit[2601]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffef7009ea0 a2=0 a3=0 items=0 ppid=2570 pid=2601 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.304000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 15 02:00:38.305000 audit[2602]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2602 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:38.305000 audit[2602]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdaa956580 a2=0 a3=0 items=0 ppid=2570 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.305000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 15 02:00:38.308134 kubelet[2570]: I0115 02:00:38.308121 2570 policy_none.go:49] "None policy: Start" Jan 15 02:00:38.308219 kubelet[2570]: I0115 02:00:38.308142 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 15 02:00:38.308219 kubelet[2570]: I0115 02:00:38.308163 2570 state_mem.go:35] "Initializing new in-memory state store" Jan 15 02:00:38.315535 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 15 02:00:38.331312 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 15 02:00:38.350792 kubelet[2570]: E0115 02:00:38.350769 2570 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:38.353320 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 15 02:00:38.354813 kubelet[2570]: I0115 02:00:38.354790 2570 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 15 02:00:38.356180 kubelet[2570]: I0115 02:00:38.355831 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 15 02:00:38.356545 kubelet[2570]: I0115 02:00:38.356497 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 15 02:00:38.357464 kubelet[2570]: I0115 02:00:38.357425 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 15 02:00:38.357938 kubelet[2570]: E0115 02:00:38.357853 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 15 02:00:38.357938 kubelet[2570]: E0115 02:00:38.357896 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:38.416667 systemd[1]: Created slice kubepods-burstable-podaf56af4ddbbc1c9aec0354e53705e667.slice - libcontainer container kubepods-burstable-podaf56af4ddbbc1c9aec0354e53705e667.slice. 
Jan 15 02:00:38.451971 kubelet[2570]: E0115 02:00:38.451928 2570 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.456969 kubelet[2570]: I0115 02:00:38.456882 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/af56af4ddbbc1c9aec0354e53705e667-ca-certs\") pod \"kube-apiserver-ci-4515-1-0-n-e5e35ee394\" (UID: \"af56af4ddbbc1c9aec0354e53705e667\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.457767 kubelet[2570]: I0115 02:00:38.457217 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/19f3b46049e229f301b301cf890c2f93-flexvolume-dir\") pod \"kube-controller-manager-ci-4515-1-0-n-e5e35ee394\" (UID: \"19f3b46049e229f301b301cf890c2f93\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.458185 kubelet[2570]: I0115 02:00:38.457951 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/19f3b46049e229f301b301cf890c2f93-kubeconfig\") pod \"kube-controller-manager-ci-4515-1-0-n-e5e35ee394\" (UID: \"19f3b46049e229f301b301cf890c2f93\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.458185 kubelet[2570]: I0115 02:00:38.458074 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3b4f910303ecc9da3f9c354a9cf7da31-kubeconfig\") pod \"kube-scheduler-ci-4515-1-0-n-e5e35ee394\" (UID: \"3b4f910303ecc9da3f9c354a9cf7da31\") " pod="kube-system/kube-scheduler-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.458185 kubelet[2570]: I0115 02:00:38.458118 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/af56af4ddbbc1c9aec0354e53705e667-k8s-certs\") pod \"kube-apiserver-ci-4515-1-0-n-e5e35ee394\" (UID: \"af56af4ddbbc1c9aec0354e53705e667\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.458565 kubelet[2570]: I0115 02:00:38.458485 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/af56af4ddbbc1c9aec0354e53705e667-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4515-1-0-n-e5e35ee394\" (UID: \"af56af4ddbbc1c9aec0354e53705e667\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.459242 kubelet[2570]: I0115 02:00:38.459094 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/19f3b46049e229f301b301cf890c2f93-ca-certs\") pod \"kube-controller-manager-ci-4515-1-0-n-e5e35ee394\" (UID: \"19f3b46049e229f301b301cf890c2f93\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.460014 kubelet[2570]: I0115 02:00:38.459142 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/19f3b46049e229f301b301cf890c2f93-k8s-certs\") pod \"kube-controller-manager-ci-4515-1-0-n-e5e35ee394\" (UID: 
\"19f3b46049e229f301b301cf890c2f93\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.460014 kubelet[2570]: I0115 02:00:38.459441 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/19f3b46049e229f301b301cf890c2f93-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4515-1-0-n-e5e35ee394\" (UID: \"19f3b46049e229f301b301cf890c2f93\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.462257 kubelet[2570]: I0115 02:00:38.461749 2570 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.462595 systemd[1]: Created slice kubepods-burstable-pod19f3b46049e229f301b301cf890c2f93.slice - libcontainer container kubepods-burstable-pod19f3b46049e229f301b301cf890c2f93.slice. Jan 15 02:00:38.462775 kubelet[2570]: E0115 02:00:38.462743 2570 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.1.164:6443/api/v1/nodes\": dial tcp 10.0.1.164:6443: connect: connection refused" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.466285 kubelet[2570]: E0115 02:00:38.465194 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.1.164:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-n-e5e35ee394?timeout=10s\": dial tcp 10.0.1.164:6443: connect: connection refused" interval="400ms" Jan 15 02:00:38.468363 kubelet[2570]: E0115 02:00:38.468281 2570 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.474964 systemd[1]: Created slice kubepods-burstable-pod3b4f910303ecc9da3f9c354a9cf7da31.slice - libcontainer container kubepods-burstable-pod3b4f910303ecc9da3f9c354a9cf7da31.slice. 
Jan 15 02:00:38.478146 kubelet[2570]: E0115 02:00:38.478085 2570 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.666927 kubelet[2570]: I0115 02:00:38.666869 2570 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.668293 kubelet[2570]: E0115 02:00:38.668242 2570 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.1.164:6443/api/v1/nodes\": dial tcp 10.0.1.164:6443: connect: connection refused" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:38.754265 containerd[1710]: time="2026-01-15T02:00:38.754103523Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4515-1-0-n-e5e35ee394,Uid:af56af4ddbbc1c9aec0354e53705e667,Namespace:kube-system,Attempt:0,}" Jan 15 02:00:38.771372 containerd[1710]: time="2026-01-15T02:00:38.771245924Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4515-1-0-n-e5e35ee394,Uid:19f3b46049e229f301b301cf890c2f93,Namespace:kube-system,Attempt:0,}" Jan 15 02:00:38.780030 containerd[1710]: time="2026-01-15T02:00:38.779871087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4515-1-0-n-e5e35ee394,Uid:3b4f910303ecc9da3f9c354a9cf7da31,Namespace:kube-system,Attempt:0,}" Jan 15 02:00:38.857066 containerd[1710]: time="2026-01-15T02:00:38.856965850Z" level=info msg="connecting to shim c89d4de40db64204e5f909e91cde9ba6c6f571506d7dae20dc8510e59d6f1e8e" address="unix:///run/containerd/s/2475164f96819ef3d681a6c18859ac5ab95b44fe3255059d198443aa49589c12" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:00:38.867685 containerd[1710]: time="2026-01-15T02:00:38.863963483Z" level=info msg="connecting to shim f30957df3dfc3eddd4611bdfccbb993a6d83e1b8f4b51dda6b704ed7c7e1cf65" address="unix:///run/containerd/s/914bb10fe30b5fb42a9a394bb05b0e70b5e1504ebad441ce89d342409a3042b6" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:00:38.867821 kubelet[2570]: E0115 02:00:38.867635 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.1.164:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-n-e5e35ee394?timeout=10s\": dial tcp 10.0.1.164:6443: connect: connection refused" interval="800ms" Jan 15 02:00:38.907197 containerd[1710]: time="2026-01-15T02:00:38.907161698Z" level=info msg="connecting to shim 368877fc110bdaaabf4203d2a4de96ed084f76d9b7d9118ae2025597af63e645" address="unix:///run/containerd/s/f546dac5fa5548fb4050c4a6b29d4f333644e9f5d3dac450259ff93710d6f106" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:00:38.911387 systemd[1]: Started cri-containerd-c89d4de40db64204e5f909e91cde9ba6c6f571506d7dae20dc8510e59d6f1e8e.scope - libcontainer container c89d4de40db64204e5f909e91cde9ba6c6f571506d7dae20dc8510e59d6f1e8e. Jan 15 02:00:38.914322 systemd[1]: Started cri-containerd-f30957df3dfc3eddd4611bdfccbb993a6d83e1b8f4b51dda6b704ed7c7e1cf65.scope - libcontainer container f30957df3dfc3eddd4611bdfccbb993a6d83e1b8f4b51dda6b704ed7c7e1cf65. 
Jan 15 02:00:38.930000 audit: BPF prog-id=83 op=LOAD Jan 15 02:00:38.932000 audit: BPF prog-id=84 op=LOAD Jan 15 02:00:38.932000 audit[2644]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=2629 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633303935376466336466633365646464343631316264666363626239 Jan 15 02:00:38.932000 audit: BPF prog-id=84 op=UNLOAD Jan 15 02:00:38.932000 audit[2644]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2629 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633303935376466336466633365646464343631316264666363626239 Jan 15 02:00:38.932000 audit: BPF prog-id=85 op=LOAD Jan 15 02:00:38.932000 audit[2644]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=2629 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633303935376466336466633365646464343631316264666363626239 Jan 15 02:00:38.932000 audit: BPF prog-id=86 op=LOAD Jan 15 02:00:38.932000 audit[2644]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=2629 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633303935376466336466633365646464343631316264666363626239 Jan 15 02:00:38.932000 audit: BPF prog-id=86 op=UNLOAD Jan 15 02:00:38.932000 audit[2644]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2629 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633303935376466336466633365646464343631316264666363626239 Jan 15 02:00:38.932000 audit: BPF prog-id=85 op=UNLOAD Jan 15 02:00:38.932000 audit[2644]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2629 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633303935376466336466633365646464343631316264666363626239 Jan 15 02:00:38.932000 audit: BPF prog-id=87 op=LOAD Jan 15 02:00:38.932000 audit[2644]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=2629 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633303935376466336466633365646464343631316264666363626239 Jan 15 02:00:38.935000 audit: BPF prog-id=88 op=LOAD Jan 15 02:00:38.936000 audit: BPF prog-id=89 op=LOAD Jan 15 02:00:38.936000 audit[2642]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000174238 a2=98 a3=0 items=0 ppid=2612 pid=2642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.936000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338396434646534306462363432303465356639303965393163646539 Jan 15 02:00:38.936000 audit: BPF prog-id=89 op=UNLOAD Jan 15 02:00:38.936000 audit[2642]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2612 pid=2642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.936000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338396434646534306462363432303465356639303965393163646539 Jan 15 02:00:38.937000 audit: BPF prog-id=90 op=LOAD Jan 15 02:00:38.937000 audit[2642]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000174488 a2=98 a3=0 items=0 ppid=2612 pid=2642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.937000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338396434646534306462363432303465356639303965393163646539 Jan 15 02:00:38.937000 audit: BPF prog-id=91 op=LOAD Jan 15 02:00:38.937000 audit[2642]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000174218 a2=98 a3=0 items=0 ppid=2612 pid=2642 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.937000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338396434646534306462363432303465356639303965393163646539 Jan 15 02:00:38.937000 audit: BPF prog-id=91 op=UNLOAD Jan 15 02:00:38.937000 audit[2642]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2612 pid=2642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.937000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338396434646534306462363432303465356639303965393163646539 Jan 15 02:00:38.937000 audit: BPF prog-id=90 op=UNLOAD Jan 15 02:00:38.937000 audit[2642]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2612 pid=2642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.937000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338396434646534306462363432303465356639303965393163646539 Jan 15 02:00:38.937000 audit: BPF prog-id=92 op=LOAD Jan 15 02:00:38.937000 audit[2642]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001746e8 a2=98 a3=0 items=0 ppid=2612 pid=2642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.937000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338396434646534306462363432303465356639303965393163646539 Jan 15 02:00:38.950015 systemd[1]: Started cri-containerd-368877fc110bdaaabf4203d2a4de96ed084f76d9b7d9118ae2025597af63e645.scope - libcontainer container 368877fc110bdaaabf4203d2a4de96ed084f76d9b7d9118ae2025597af63e645. 
Jan 15 02:00:38.974000 audit: BPF prog-id=93 op=LOAD Jan 15 02:00:38.974000 audit: BPF prog-id=94 op=LOAD Jan 15 02:00:38.974000 audit[2691]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=2676 pid=2691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.974000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336383837376663313130626461616162663432303364326134646539 Jan 15 02:00:38.975000 audit: BPF prog-id=94 op=UNLOAD Jan 15 02:00:38.975000 audit[2691]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2676 pid=2691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.975000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336383837376663313130626461616162663432303364326134646539 Jan 15 02:00:38.975000 audit: BPF prog-id=95 op=LOAD Jan 15 02:00:38.975000 audit[2691]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=2676 pid=2691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.975000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336383837376663313130626461616162663432303364326134646539 Jan 15 02:00:38.975000 audit: BPF prog-id=96 op=LOAD Jan 15 02:00:38.975000 audit[2691]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=2676 pid=2691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.975000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336383837376663313130626461616162663432303364326134646539 Jan 15 02:00:38.975000 audit: BPF prog-id=96 op=UNLOAD Jan 15 02:00:38.975000 audit[2691]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2676 pid=2691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.975000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336383837376663313130626461616162663432303364326134646539 Jan 15 02:00:38.975000 audit: BPF prog-id=95 op=UNLOAD Jan 15 02:00:38.975000 audit[2691]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2676 pid=2691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.975000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336383837376663313130626461616162663432303364326134646539 Jan 15 02:00:38.975000 audit: BPF prog-id=97 op=LOAD Jan 15 02:00:38.975000 audit[2691]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=2676 pid=2691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:38.975000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3336383837376663313130626461616162663432303364326134646539 Jan 15 02:00:38.985615 containerd[1710]: time="2026-01-15T02:00:38.985558122Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4515-1-0-n-e5e35ee394,Uid:19f3b46049e229f301b301cf890c2f93,Namespace:kube-system,Attempt:0,} returns sandbox id \"f30957df3dfc3eddd4611bdfccbb993a6d83e1b8f4b51dda6b704ed7c7e1cf65\"" Jan 15 02:00:38.988239 containerd[1710]: time="2026-01-15T02:00:38.988047373Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4515-1-0-n-e5e35ee394,Uid:af56af4ddbbc1c9aec0354e53705e667,Namespace:kube-system,Attempt:0,} returns sandbox id \"c89d4de40db64204e5f909e91cde9ba6c6f571506d7dae20dc8510e59d6f1e8e\"" Jan 15 02:00:38.991372 containerd[1710]: time="2026-01-15T02:00:38.991338470Z" level=info msg="CreateContainer within sandbox \"f30957df3dfc3eddd4611bdfccbb993a6d83e1b8f4b51dda6b704ed7c7e1cf65\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 15 02:00:38.992760 containerd[1710]: time="2026-01-15T02:00:38.992170239Z" level=info msg="CreateContainer within sandbox \"c89d4de40db64204e5f909e91cde9ba6c6f571506d7dae20dc8510e59d6f1e8e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 15 02:00:39.009050 containerd[1710]: time="2026-01-15T02:00:39.008246675Z" level=info msg="Container b929c2ce0fb073b45b809f8ba521cde702e196f247f2bd1ce6352c54af0d5190: CDI devices from CRI Config.CDIDevices: []" Jan 15 02:00:39.010540 containerd[1710]: time="2026-01-15T02:00:39.010520411Z" level=info msg="Container 292d93f9bedd7bfe811ed741d35f91652f0ab24b7b978b657f747e8aa7910209: CDI devices from CRI Config.CDIDevices: []" Jan 15 02:00:39.021381 containerd[1710]: time="2026-01-15T02:00:39.021363136Z" level=info msg="CreateContainer within sandbox \"c89d4de40db64204e5f909e91cde9ba6c6f571506d7dae20dc8510e59d6f1e8e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"b929c2ce0fb073b45b809f8ba521cde702e196f247f2bd1ce6352c54af0d5190\"" Jan 15 02:00:39.021892 containerd[1710]: time="2026-01-15T02:00:39.021878280Z" level=info msg="StartContainer for \"b929c2ce0fb073b45b809f8ba521cde702e196f247f2bd1ce6352c54af0d5190\"" Jan 15 02:00:39.023166 containerd[1710]: time="2026-01-15T02:00:39.023132953Z" level=info msg="connecting to shim 
b929c2ce0fb073b45b809f8ba521cde702e196f247f2bd1ce6352c54af0d5190" address="unix:///run/containerd/s/2475164f96819ef3d681a6c18859ac5ab95b44fe3255059d198443aa49589c12" protocol=ttrpc version=3 Jan 15 02:00:39.028571 kubelet[2570]: W0115 02:00:39.028534 2570 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.1.164:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-n-e5e35ee394&limit=500&resourceVersion=0": dial tcp 10.0.1.164:6443: connect: connection refused Jan 15 02:00:39.028775 kubelet[2570]: E0115 02:00:39.028761 2570 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.1.164:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-n-e5e35ee394&limit=500&resourceVersion=0\": dial tcp 10.0.1.164:6443: connect: connection refused" logger="UnhandledError" Jan 15 02:00:39.029239 containerd[1710]: time="2026-01-15T02:00:39.029215625Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4515-1-0-n-e5e35ee394,Uid:3b4f910303ecc9da3f9c354a9cf7da31,Namespace:kube-system,Attempt:0,} returns sandbox id \"368877fc110bdaaabf4203d2a4de96ed084f76d9b7d9118ae2025597af63e645\"" Jan 15 02:00:39.030997 containerd[1710]: time="2026-01-15T02:00:39.030980564Z" level=info msg="CreateContainer within sandbox \"368877fc110bdaaabf4203d2a4de96ed084f76d9b7d9118ae2025597af63e645\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 15 02:00:39.038544 containerd[1710]: time="2026-01-15T02:00:39.038516577Z" level=info msg="CreateContainer within sandbox \"f30957df3dfc3eddd4611bdfccbb993a6d83e1b8f4b51dda6b704ed7c7e1cf65\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"292d93f9bedd7bfe811ed741d35f91652f0ab24b7b978b657f747e8aa7910209\"" Jan 15 02:00:39.038961 containerd[1710]: time="2026-01-15T02:00:39.038942734Z" level=info msg="StartContainer for \"292d93f9bedd7bfe811ed741d35f91652f0ab24b7b978b657f747e8aa7910209\"" Jan 15 02:00:39.039873 containerd[1710]: time="2026-01-15T02:00:39.039839567Z" level=info msg="connecting to shim 292d93f9bedd7bfe811ed741d35f91652f0ab24b7b978b657f747e8aa7910209" address="unix:///run/containerd/s/914bb10fe30b5fb42a9a394bb05b0e70b5e1504ebad441ce89d342409a3042b6" protocol=ttrpc version=3 Jan 15 02:00:39.040317 systemd[1]: Started cri-containerd-b929c2ce0fb073b45b809f8ba521cde702e196f247f2bd1ce6352c54af0d5190.scope - libcontainer container b929c2ce0fb073b45b809f8ba521cde702e196f247f2bd1ce6352c54af0d5190. Jan 15 02:00:39.047348 containerd[1710]: time="2026-01-15T02:00:39.047327556Z" level=info msg="Container 177a36f38e40737213a2dc6fdbb816fea589dea8e2ec742414219dc4fea3743f: CDI devices from CRI Config.CDIDevices: []" Jan 15 02:00:39.059301 systemd[1]: Started cri-containerd-292d93f9bedd7bfe811ed741d35f91652f0ab24b7b978b657f747e8aa7910209.scope - libcontainer container 292d93f9bedd7bfe811ed741d35f91652f0ab24b7b978b657f747e8aa7910209. 
Jan 15 02:00:39.059897 containerd[1710]: time="2026-01-15T02:00:39.059874819Z" level=info msg="CreateContainer within sandbox \"368877fc110bdaaabf4203d2a4de96ed084f76d9b7d9118ae2025597af63e645\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"177a36f38e40737213a2dc6fdbb816fea589dea8e2ec742414219dc4fea3743f\"" Jan 15 02:00:39.059000 audit: BPF prog-id=98 op=LOAD Jan 15 02:00:39.061214 containerd[1710]: time="2026-01-15T02:00:39.060142660Z" level=info msg="StartContainer for \"177a36f38e40737213a2dc6fdbb816fea589dea8e2ec742414219dc4fea3743f\"" Jan 15 02:00:39.060000 audit: BPF prog-id=99 op=LOAD Jan 15 02:00:39.060000 audit[2740]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=2612 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.060000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239323963326365306662303733623435623830396638626135323163 Jan 15 02:00:39.060000 audit: BPF prog-id=99 op=UNLOAD Jan 15 02:00:39.060000 audit[2740]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2612 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.060000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239323963326365306662303733623435623830396638626135323163 Jan 15 02:00:39.060000 audit: BPF prog-id=100 op=LOAD Jan 15 02:00:39.060000 audit[2740]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=2612 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.060000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239323963326365306662303733623435623830396638626135323163 Jan 15 02:00:39.060000 audit: BPF prog-id=101 op=LOAD Jan 15 02:00:39.060000 audit[2740]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=2612 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.060000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239323963326365306662303733623435623830396638626135323163 Jan 15 02:00:39.060000 audit: BPF prog-id=101 op=UNLOAD Jan 15 02:00:39.060000 audit[2740]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2612 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.060000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239323963326365306662303733623435623830396638626135323163 Jan 15 02:00:39.060000 audit: BPF prog-id=100 op=UNLOAD Jan 15 02:00:39.060000 audit[2740]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2612 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.060000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239323963326365306662303733623435623830396638626135323163 Jan 15 02:00:39.062438 containerd[1710]: time="2026-01-15T02:00:39.061971822Z" level=info msg="connecting to shim 177a36f38e40737213a2dc6fdbb816fea589dea8e2ec742414219dc4fea3743f" address="unix:///run/containerd/s/f546dac5fa5548fb4050c4a6b29d4f333644e9f5d3dac450259ff93710d6f106" protocol=ttrpc version=3 Jan 15 02:00:39.061000 audit: BPF prog-id=102 op=LOAD Jan 15 02:00:39.061000 audit[2740]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=2612 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.061000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239323963326365306662303733623435623830396638626135323163 Jan 15 02:00:39.071689 kubelet[2570]: I0115 02:00:39.071627 2570 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:39.072885 kubelet[2570]: E0115 02:00:39.072630 2570 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.1.164:6443/api/v1/nodes\": dial tcp 10.0.1.164:6443: connect: connection refused" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:39.076000 audit: BPF prog-id=103 op=LOAD Jan 15 02:00:39.076000 audit: BPF prog-id=104 op=LOAD Jan 15 02:00:39.076000 audit[2754]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=2629 pid=2754 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3239326439336639626564643762666538313165643734316433356639 Jan 15 02:00:39.076000 audit: BPF prog-id=104 op=UNLOAD Jan 15 02:00:39.076000 audit[2754]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2629 pid=2754 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3239326439336639626564643762666538313165643734316433356639 Jan 15 02:00:39.076000 audit: BPF prog-id=105 op=LOAD Jan 15 02:00:39.076000 audit[2754]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=2629 pid=2754 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3239326439336639626564643762666538313165643734316433356639 Jan 15 02:00:39.076000 audit: BPF prog-id=106 op=LOAD Jan 15 02:00:39.076000 audit[2754]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=2629 pid=2754 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3239326439336639626564643762666538313165643734316433356639 Jan 15 02:00:39.077000 audit: BPF prog-id=106 op=UNLOAD Jan 15 02:00:39.077000 audit[2754]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2629 pid=2754 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.077000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3239326439336639626564643762666538313165643734316433356639 Jan 15 02:00:39.077000 audit: BPF prog-id=105 op=UNLOAD Jan 15 02:00:39.077000 audit[2754]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2629 pid=2754 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.077000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3239326439336639626564643762666538313165643734316433356639 Jan 15 02:00:39.077000 audit: BPF prog-id=107 op=LOAD Jan 15 02:00:39.077000 audit[2754]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=2629 pid=2754 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.077000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3239326439336639626564643762666538313165643734316433356639 Jan 15 02:00:39.093514 systemd[1]: Started cri-containerd-177a36f38e40737213a2dc6fdbb816fea589dea8e2ec742414219dc4fea3743f.scope - libcontainer container 177a36f38e40737213a2dc6fdbb816fea589dea8e2ec742414219dc4fea3743f. Jan 15 02:00:39.114000 audit: BPF prog-id=108 op=LOAD Jan 15 02:00:39.115000 audit: BPF prog-id=109 op=LOAD Jan 15 02:00:39.115000 audit[2775]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8238 a2=98 a3=0 items=0 ppid=2676 pid=2775 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.115000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137376133366633386534303733373231336132646336666462623831 Jan 15 02:00:39.115000 audit: BPF prog-id=109 op=UNLOAD Jan 15 02:00:39.115000 audit[2775]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2676 pid=2775 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.115000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137376133366633386534303733373231336132646336666462623831 Jan 15 02:00:39.115000 audit: BPF prog-id=110 op=LOAD Jan 15 02:00:39.115000 audit[2775]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a8488 a2=98 a3=0 items=0 ppid=2676 pid=2775 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.115000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137376133366633386534303733373231336132646336666462623831 Jan 15 02:00:39.115000 audit: BPF prog-id=111 op=LOAD Jan 15 02:00:39.115000 audit[2775]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a8218 a2=98 a3=0 items=0 ppid=2676 pid=2775 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.115000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137376133366633386534303733373231336132646336666462623831 Jan 15 02:00:39.115000 audit: BPF prog-id=111 op=UNLOAD Jan 15 02:00:39.115000 audit[2775]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2676 pid=2775 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.115000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137376133366633386534303733373231336132646336666462623831 Jan 15 02:00:39.115000 audit: BPF prog-id=110 op=UNLOAD Jan 15 02:00:39.115000 audit[2775]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2676 pid=2775 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.115000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137376133366633386534303733373231336132646336666462623831 Jan 15 02:00:39.115000 audit: BPF prog-id=112 op=LOAD Jan 15 02:00:39.115000 audit[2775]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a86e8 a2=98 a3=0 items=0 ppid=2676 pid=2775 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:39.115000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3137376133366633386534303733373231336132646336666462623831 Jan 15 02:00:39.133491 containerd[1710]: time="2026-01-15T02:00:39.133408273Z" level=info msg="StartContainer for \"b929c2ce0fb073b45b809f8ba521cde702e196f247f2bd1ce6352c54af0d5190\" returns successfully" Jan 15 02:00:39.134687 containerd[1710]: time="2026-01-15T02:00:39.134665267Z" level=info msg="StartContainer for \"292d93f9bedd7bfe811ed741d35f91652f0ab24b7b978b657f747e8aa7910209\" returns successfully" Jan 15 02:00:39.208649 containerd[1710]: time="2026-01-15T02:00:39.208616957Z" level=info msg="StartContainer for \"177a36f38e40737213a2dc6fdbb816fea589dea8e2ec742414219dc4fea3743f\" returns successfully" Jan 15 02:00:39.310666 kubelet[2570]: E0115 02:00:39.310461 2570 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:39.314305 kubelet[2570]: E0115 02:00:39.314293 2570 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:39.314867 kubelet[2570]: E0115 02:00:39.314832 2570 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:39.874604 kubelet[2570]: I0115 02:00:39.874578 2570 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:40.316916 kubelet[2570]: E0115 02:00:40.316700 2570 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:40.317200 kubelet[2570]: E0115 
02:00:40.317091 2570 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:40.709147 kubelet[2570]: E0115 02:00:40.709063 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4515-1-0-n-e5e35ee394\" not found" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:40.794222 kubelet[2570]: I0115 02:00:40.794098 2570 kubelet_node_status.go:78] "Successfully registered node" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:40.794222 kubelet[2570]: E0115 02:00:40.794125 2570 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4515-1-0-n-e5e35ee394\": node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:40.809961 kubelet[2570]: E0115 02:00:40.809938 2570 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:40.910500 kubelet[2570]: E0115 02:00:40.910468 2570 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:41.011443 kubelet[2570]: E0115 02:00:41.011250 2570 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:41.112224 kubelet[2570]: E0115 02:00:41.112113 2570 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:41.213086 kubelet[2570]: E0115 02:00:41.213017 2570 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:41.314032 kubelet[2570]: E0115 02:00:41.313971 2570 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:41.414345 kubelet[2570]: E0115 02:00:41.414204 2570 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:41.515377 kubelet[2570]: E0115 02:00:41.515322 2570 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:41.616482 kubelet[2570]: E0115 02:00:41.616328 2570 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:41.716588 kubelet[2570]: E0115 02:00:41.716529 2570 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:41.851014 kubelet[2570]: I0115 02:00:41.850951 2570 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:41.869195 kubelet[2570]: I0115 02:00:41.867065 2570 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:41.878809 kubelet[2570]: I0115 02:00:41.878626 2570 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:42.227136 kubelet[2570]: I0115 02:00:42.226509 2570 apiserver.go:52] "Watching apiserver" Jan 15 02:00:42.257416 kubelet[2570]: I0115 02:00:42.257359 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 15 02:00:43.070911 systemd[1]: Reload 
requested from client PID 2839 ('systemctl') (unit session-9.scope)... Jan 15 02:00:43.070946 systemd[1]: Reloading... Jan 15 02:00:43.204206 zram_generator::config[2891]: No configuration found. Jan 15 02:00:43.393041 systemd[1]: Reloading finished in 321 ms. Jan 15 02:00:43.437480 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 02:00:43.452574 systemd[1]: kubelet.service: Deactivated successfully. Jan 15 02:00:43.452872 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 02:00:43.457281 kernel: kauditd_printk_skb: 158 callbacks suppressed Jan 15 02:00:43.457340 kernel: audit: type=1131 audit(1768442443.451:404): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:43.451000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:43.453759 systemd[1]: kubelet.service: Consumed 1.255s CPU time, 131.6M memory peak. Jan 15 02:00:43.460019 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 15 02:00:43.461000 audit: BPF prog-id=113 op=LOAD Jan 15 02:00:43.464163 kernel: audit: type=1334 audit(1768442443.461:405): prog-id=113 op=LOAD Jan 15 02:00:43.461000 audit: BPF prog-id=64 op=UNLOAD Jan 15 02:00:43.467176 kernel: audit: type=1334 audit(1768442443.461:406): prog-id=64 op=UNLOAD Jan 15 02:00:43.473000 audit: BPF prog-id=114 op=LOAD Jan 15 02:00:43.477182 kernel: audit: type=1334 audit(1768442443.473:407): prog-id=114 op=LOAD Jan 15 02:00:43.477224 kernel: audit: type=1334 audit(1768442443.473:408): prog-id=65 op=UNLOAD Jan 15 02:00:43.473000 audit: BPF prog-id=65 op=UNLOAD Jan 15 02:00:43.473000 audit: BPF prog-id=115 op=LOAD Jan 15 02:00:43.480214 kernel: audit: type=1334 audit(1768442443.473:409): prog-id=115 op=LOAD Jan 15 02:00:43.480252 kernel: audit: type=1334 audit(1768442443.473:410): prog-id=116 op=LOAD Jan 15 02:00:43.473000 audit: BPF prog-id=116 op=LOAD Jan 15 02:00:43.481717 kernel: audit: type=1334 audit(1768442443.473:411): prog-id=66 op=UNLOAD Jan 15 02:00:43.473000 audit: BPF prog-id=66 op=UNLOAD Jan 15 02:00:43.483241 kernel: audit: type=1334 audit(1768442443.473:412): prog-id=67 op=UNLOAD Jan 15 02:00:43.473000 audit: BPF prog-id=67 op=UNLOAD Jan 15 02:00:43.484709 kernel: audit: type=1334 audit(1768442443.476:413): prog-id=117 op=LOAD Jan 15 02:00:43.476000 audit: BPF prog-id=117 op=LOAD Jan 15 02:00:43.476000 audit: BPF prog-id=71 op=UNLOAD Jan 15 02:00:43.478000 audit: BPF prog-id=118 op=LOAD Jan 15 02:00:43.478000 audit: BPF prog-id=74 op=UNLOAD Jan 15 02:00:43.478000 audit: BPF prog-id=119 op=LOAD Jan 15 02:00:43.478000 audit: BPF prog-id=120 op=LOAD Jan 15 02:00:43.478000 audit: BPF prog-id=75 op=UNLOAD Jan 15 02:00:43.478000 audit: BPF prog-id=76 op=UNLOAD Jan 15 02:00:43.482000 audit: BPF prog-id=121 op=LOAD Jan 15 02:00:43.482000 audit: BPF prog-id=77 op=UNLOAD Jan 15 02:00:43.482000 audit: BPF prog-id=122 op=LOAD Jan 15 02:00:43.482000 audit: BPF prog-id=123 op=LOAD Jan 15 02:00:43.482000 audit: BPF prog-id=78 op=UNLOAD Jan 15 02:00:43.482000 audit: BPF prog-id=79 op=UNLOAD Jan 15 02:00:43.484000 audit: BPF prog-id=124 op=LOAD Jan 15 02:00:43.484000 audit: BPF prog-id=68 op=UNLOAD Jan 15 02:00:43.484000 audit: BPF prog-id=125 op=LOAD Jan 15 02:00:43.484000 audit: BPF 
prog-id=126 op=LOAD Jan 15 02:00:43.484000 audit: BPF prog-id=69 op=UNLOAD Jan 15 02:00:43.484000 audit: BPF prog-id=70 op=UNLOAD Jan 15 02:00:43.486000 audit: BPF prog-id=127 op=LOAD Jan 15 02:00:43.486000 audit: BPF prog-id=63 op=UNLOAD Jan 15 02:00:43.488000 audit: BPF prog-id=128 op=LOAD Jan 15 02:00:43.488000 audit: BPF prog-id=80 op=UNLOAD Jan 15 02:00:43.488000 audit: BPF prog-id=129 op=LOAD Jan 15 02:00:43.488000 audit: BPF prog-id=130 op=LOAD Jan 15 02:00:43.488000 audit: BPF prog-id=81 op=UNLOAD Jan 15 02:00:43.488000 audit: BPF prog-id=82 op=UNLOAD Jan 15 02:00:43.489000 audit: BPF prog-id=131 op=LOAD Jan 15 02:00:43.489000 audit: BPF prog-id=132 op=LOAD Jan 15 02:00:43.489000 audit: BPF prog-id=72 op=UNLOAD Jan 15 02:00:43.489000 audit: BPF prog-id=73 op=UNLOAD Jan 15 02:00:43.608859 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 15 02:00:43.609000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:00:43.617576 (kubelet)[2935]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 15 02:00:43.687620 kubelet[2935]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 02:00:43.690557 kubelet[2935]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 15 02:00:43.690557 kubelet[2935]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 15 02:00:43.690557 kubelet[2935]: I0115 02:00:43.688555 2935 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 15 02:00:43.698575 kubelet[2935]: I0115 02:00:43.698556 2935 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 15 02:00:43.698660 kubelet[2935]: I0115 02:00:43.698653 2935 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 15 02:00:43.698882 kubelet[2935]: I0115 02:00:43.698873 2935 server.go:954] "Client rotation is on, will bootstrap in background" Jan 15 02:00:43.699896 kubelet[2935]: I0115 02:00:43.699884 2935 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 15 02:00:43.701844 kubelet[2935]: I0115 02:00:43.701828 2935 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 15 02:00:43.706202 kubelet[2935]: I0115 02:00:43.706190 2935 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 15 02:00:43.708561 kubelet[2935]: I0115 02:00:43.708549 2935 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 15 02:00:43.708806 kubelet[2935]: I0115 02:00:43.708786 2935 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 15 02:00:43.709115 kubelet[2935]: I0115 02:00:43.708846 2935 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4515-1-0-n-e5e35ee394","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 15 02:00:43.709264 kubelet[2935]: I0115 02:00:43.709255 2935 topology_manager.go:138] "Creating topology manager with none policy" Jan 15 02:00:43.709304 kubelet[2935]: I0115 02:00:43.709300 2935 container_manager_linux.go:304] "Creating device plugin manager" Jan 15 02:00:43.709418 kubelet[2935]: I0115 02:00:43.709413 2935 state_mem.go:36] "Initialized new in-memory state store" Jan 15 02:00:43.709586 kubelet[2935]: I0115 02:00:43.709579 2935 kubelet.go:446] "Attempting to sync node with API server" Jan 15 02:00:43.709638 kubelet[2935]: I0115 02:00:43.709633 2935 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 15 02:00:43.709686 kubelet[2935]: I0115 02:00:43.709681 2935 kubelet.go:352] "Adding apiserver pod source" Jan 15 02:00:43.709719 kubelet[2935]: I0115 02:00:43.709715 2935 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 15 02:00:43.711312 kubelet[2935]: I0115 02:00:43.711300 2935 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 15 02:00:43.711698 kubelet[2935]: I0115 02:00:43.711689 2935 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 15 02:00:43.712460 kubelet[2935]: I0115 02:00:43.712449 2935 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 15 02:00:43.712529 kubelet[2935]: I0115 02:00:43.712524 2935 server.go:1287] "Started kubelet" Jan 15 02:00:43.714726 kubelet[2935]: I0115 02:00:43.714709 2935 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 15 02:00:43.720968 kubelet[2935]: I0115 02:00:43.720930 2935 server.go:169] 
"Starting to listen" address="0.0.0.0" port=10250 Jan 15 02:00:43.721964 kubelet[2935]: I0115 02:00:43.721944 2935 server.go:479] "Adding debug handlers to kubelet server" Jan 15 02:00:43.722822 kubelet[2935]: I0115 02:00:43.722793 2935 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 15 02:00:43.723030 kubelet[2935]: I0115 02:00:43.723023 2935 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 15 02:00:43.723232 kubelet[2935]: I0115 02:00:43.723222 2935 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 15 02:00:43.725867 kubelet[2935]: I0115 02:00:43.725856 2935 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 15 02:00:43.726070 kubelet[2935]: E0115 02:00:43.726058 2935 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4515-1-0-n-e5e35ee394\" not found" Jan 15 02:00:43.730848 kubelet[2935]: I0115 02:00:43.730834 2935 factory.go:221] Registration of the systemd container factory successfully Jan 15 02:00:43.730974 kubelet[2935]: I0115 02:00:43.730962 2935 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 15 02:00:43.731747 kubelet[2935]: I0115 02:00:43.731738 2935 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 15 02:00:43.731905 kubelet[2935]: I0115 02:00:43.731897 2935 reconciler.go:26] "Reconciler: start to sync state" Jan 15 02:00:43.735128 kubelet[2935]: I0115 02:00:43.735107 2935 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 15 02:00:43.736092 kubelet[2935]: I0115 02:00:43.736079 2935 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 15 02:00:43.736206 kubelet[2935]: I0115 02:00:43.736199 2935 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 15 02:00:43.736264 kubelet[2935]: I0115 02:00:43.736250 2935 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 15 02:00:43.736319 kubelet[2935]: I0115 02:00:43.736314 2935 kubelet.go:2382] "Starting kubelet main sync loop" Jan 15 02:00:43.736542 kubelet[2935]: E0115 02:00:43.736489 2935 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 15 02:00:43.741211 kubelet[2935]: I0115 02:00:43.741190 2935 factory.go:221] Registration of the containerd container factory successfully Jan 15 02:00:43.754712 kubelet[2935]: E0115 02:00:43.754321 2935 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 15 02:00:43.790058 kubelet[2935]: I0115 02:00:43.790039 2935 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 15 02:00:43.790058 kubelet[2935]: I0115 02:00:43.790052 2935 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 15 02:00:43.790058 kubelet[2935]: I0115 02:00:43.790065 2935 state_mem.go:36] "Initialized new in-memory state store" Jan 15 02:00:43.790215 kubelet[2935]: I0115 02:00:43.790205 2935 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 15 02:00:43.790252 kubelet[2935]: I0115 02:00:43.790214 2935 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 15 02:00:43.790252 kubelet[2935]: I0115 02:00:43.790229 2935 policy_none.go:49] "None policy: Start" Jan 15 02:00:43.790252 kubelet[2935]: I0115 02:00:43.790236 2935 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 15 02:00:43.790252 kubelet[2935]: I0115 02:00:43.790245 2935 state_mem.go:35] "Initializing new in-memory state store" Jan 15 02:00:43.790340 kubelet[2935]: I0115 02:00:43.790322 2935 state_mem.go:75] "Updated machine memory state" Jan 15 02:00:43.793644 kubelet[2935]: I0115 02:00:43.793627 2935 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 15 02:00:43.794246 kubelet[2935]: I0115 02:00:43.794214 2935 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 15 02:00:43.794343 kubelet[2935]: I0115 02:00:43.794227 2935 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 15 02:00:43.795886 kubelet[2935]: I0115 02:00:43.795834 2935 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 15 02:00:43.797058 kubelet[2935]: E0115 02:00:43.797044 2935 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 15 02:00:43.837788 kubelet[2935]: I0115 02:00:43.837761 2935 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:43.837922 kubelet[2935]: I0115 02:00:43.837909 2935 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:43.838635 kubelet[2935]: I0115 02:00:43.837815 2935 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:43.847142 kubelet[2935]: E0115 02:00:43.846976 2935 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4515-1-0-n-e5e35ee394\" already exists" pod="kube-system/kube-apiserver-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:43.847568 kubelet[2935]: E0115 02:00:43.847531 2935 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4515-1-0-n-e5e35ee394\" already exists" pod="kube-system/kube-scheduler-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:43.847655 kubelet[2935]: E0115 02:00:43.847646 2935 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4515-1-0-n-e5e35ee394\" already exists" pod="kube-system/kube-controller-manager-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:43.897951 kubelet[2935]: I0115 02:00:43.897894 2935 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:43.907583 kubelet[2935]: I0115 02:00:43.907543 2935 kubelet_node_status.go:124] "Node was previously registered" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:43.907696 kubelet[2935]: I0115 02:00:43.907604 2935 kubelet_node_status.go:78] "Successfully registered node" node="ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:44.033359 kubelet[2935]: I0115 02:00:44.032943 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/19f3b46049e229f301b301cf890c2f93-ca-certs\") pod \"kube-controller-manager-ci-4515-1-0-n-e5e35ee394\" (UID: \"19f3b46049e229f301b301cf890c2f93\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:44.034272 kubelet[2935]: I0115 02:00:44.034226 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/19f3b46049e229f301b301cf890c2f93-flexvolume-dir\") pod \"kube-controller-manager-ci-4515-1-0-n-e5e35ee394\" (UID: \"19f3b46049e229f301b301cf890c2f93\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:44.034947 kubelet[2935]: I0115 02:00:44.034834 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/19f3b46049e229f301b301cf890c2f93-kubeconfig\") pod \"kube-controller-manager-ci-4515-1-0-n-e5e35ee394\" (UID: \"19f3b46049e229f301b301cf890c2f93\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:44.035223 kubelet[2935]: I0115 02:00:44.035102 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/19f3b46049e229f301b301cf890c2f93-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4515-1-0-n-e5e35ee394\" (UID: \"19f3b46049e229f301b301cf890c2f93\") " 
pod="kube-system/kube-controller-manager-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:44.035602 kubelet[2935]: I0115 02:00:44.035509 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3b4f910303ecc9da3f9c354a9cf7da31-kubeconfig\") pod \"kube-scheduler-ci-4515-1-0-n-e5e35ee394\" (UID: \"3b4f910303ecc9da3f9c354a9cf7da31\") " pod="kube-system/kube-scheduler-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:44.035872 kubelet[2935]: I0115 02:00:44.035777 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/af56af4ddbbc1c9aec0354e53705e667-ca-certs\") pod \"kube-apiserver-ci-4515-1-0-n-e5e35ee394\" (UID: \"af56af4ddbbc1c9aec0354e53705e667\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:44.036132 kubelet[2935]: I0115 02:00:44.036084 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/af56af4ddbbc1c9aec0354e53705e667-k8s-certs\") pod \"kube-apiserver-ci-4515-1-0-n-e5e35ee394\" (UID: \"af56af4ddbbc1c9aec0354e53705e667\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:44.036379 kubelet[2935]: I0115 02:00:44.036318 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/af56af4ddbbc1c9aec0354e53705e667-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4515-1-0-n-e5e35ee394\" (UID: \"af56af4ddbbc1c9aec0354e53705e667\") " pod="kube-system/kube-apiserver-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:44.036613 kubelet[2935]: I0115 02:00:44.036527 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/19f3b46049e229f301b301cf890c2f93-k8s-certs\") pod \"kube-controller-manager-ci-4515-1-0-n-e5e35ee394\" (UID: \"19f3b46049e229f301b301cf890c2f93\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:44.729784 kubelet[2935]: I0115 02:00:44.729411 2935 apiserver.go:52] "Watching apiserver" Jan 15 02:00:44.784529 kubelet[2935]: I0115 02:00:44.783583 2935 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:44.784529 kubelet[2935]: I0115 02:00:44.784278 2935 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:44.802191 kubelet[2935]: E0115 02:00:44.800543 2935 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4515-1-0-n-e5e35ee394\" already exists" pod="kube-system/kube-apiserver-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:44.802191 kubelet[2935]: E0115 02:00:44.800882 2935 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4515-1-0-n-e5e35ee394\" already exists" pod="kube-system/kube-scheduler-ci-4515-1-0-n-e5e35ee394" Jan 15 02:00:44.822840 kubelet[2935]: I0115 02:00:44.821611 2935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4515-1-0-n-e5e35ee394" podStartSLOduration=3.821583117 podStartE2EDuration="3.821583117s" podCreationTimestamp="2026-01-15 02:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-15 02:00:44.819957001 +0000 UTC m=+1.197948875" watchObservedRunningTime="2026-01-15 02:00:44.821583117 +0000 UTC m=+1.199575000" Jan 15 02:00:44.833542 kubelet[2935]: I0115 02:00:44.831960 2935 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 15 02:00:44.833542 kubelet[2935]: I0115 02:00:44.832019 2935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4515-1-0-n-e5e35ee394" podStartSLOduration=3.832006986 podStartE2EDuration="3.832006986s" podCreationTimestamp="2026-01-15 02:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 02:00:44.831651721 +0000 UTC m=+1.209643597" watchObservedRunningTime="2026-01-15 02:00:44.832006986 +0000 UTC m=+1.209998829" Jan 15 02:00:44.843454 kubelet[2935]: I0115 02:00:44.843352 2935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4515-1-0-n-e5e35ee394" podStartSLOduration=3.843327896 podStartE2EDuration="3.843327896s" podCreationTimestamp="2026-01-15 02:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 02:00:44.842270212 +0000 UTC m=+1.220262095" watchObservedRunningTime="2026-01-15 02:00:44.843327896 +0000 UTC m=+1.221319861" Jan 15 02:00:49.697019 kubelet[2935]: I0115 02:00:49.696973 2935 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 15 02:00:49.698613 containerd[1710]: time="2026-01-15T02:00:49.698566284Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 15 02:00:49.699348 kubelet[2935]: I0115 02:00:49.698870 2935 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 15 02:00:50.550263 systemd[1]: Created slice kubepods-besteffort-pod46595ec8_8c70_430f_be21_2beae49943e1.slice - libcontainer container kubepods-besteffort-pod46595ec8_8c70_430f_be21_2beae49943e1.slice. 
Jan 15 02:00:50.577810 kubelet[2935]: I0115 02:00:50.577728 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/46595ec8-8c70-430f-be21-2beae49943e1-kube-proxy\") pod \"kube-proxy-6p54l\" (UID: \"46595ec8-8c70-430f-be21-2beae49943e1\") " pod="kube-system/kube-proxy-6p54l" Jan 15 02:00:50.577942 kubelet[2935]: I0115 02:00:50.577813 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46595ec8-8c70-430f-be21-2beae49943e1-lib-modules\") pod \"kube-proxy-6p54l\" (UID: \"46595ec8-8c70-430f-be21-2beae49943e1\") " pod="kube-system/kube-proxy-6p54l" Jan 15 02:00:50.577942 kubelet[2935]: I0115 02:00:50.577872 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfrpw\" (UniqueName: \"kubernetes.io/projected/46595ec8-8c70-430f-be21-2beae49943e1-kube-api-access-pfrpw\") pod \"kube-proxy-6p54l\" (UID: \"46595ec8-8c70-430f-be21-2beae49943e1\") " pod="kube-system/kube-proxy-6p54l" Jan 15 02:00:50.577942 kubelet[2935]: I0115 02:00:50.577931 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/46595ec8-8c70-430f-be21-2beae49943e1-xtables-lock\") pod \"kube-proxy-6p54l\" (UID: \"46595ec8-8c70-430f-be21-2beae49943e1\") " pod="kube-system/kube-proxy-6p54l" Jan 15 02:00:50.820149 systemd[1]: Created slice kubepods-besteffort-podf948374c_244b_49c7_b11f_37a1d5a95b75.slice - libcontainer container kubepods-besteffort-podf948374c_244b_49c7_b11f_37a1d5a95b75.slice. Jan 15 02:00:50.862768 containerd[1710]: time="2026-01-15T02:00:50.862723196Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-6p54l,Uid:46595ec8-8c70-430f-be21-2beae49943e1,Namespace:kube-system,Attempt:0,}" Jan 15 02:00:50.879766 kubelet[2935]: I0115 02:00:50.879648 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l46cx\" (UniqueName: \"kubernetes.io/projected/f948374c-244b-49c7-b11f-37a1d5a95b75-kube-api-access-l46cx\") pod \"tigera-operator-7dcd859c48-jm8bw\" (UID: \"f948374c-244b-49c7-b11f-37a1d5a95b75\") " pod="tigera-operator/tigera-operator-7dcd859c48-jm8bw" Jan 15 02:00:50.879766 kubelet[2935]: I0115 02:00:50.879707 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f948374c-244b-49c7-b11f-37a1d5a95b75-var-lib-calico\") pod \"tigera-operator-7dcd859c48-jm8bw\" (UID: \"f948374c-244b-49c7-b11f-37a1d5a95b75\") " pod="tigera-operator/tigera-operator-7dcd859c48-jm8bw" Jan 15 02:00:50.902369 containerd[1710]: time="2026-01-15T02:00:50.902322082Z" level=info msg="connecting to shim 13b670bd5fc4d31b0dc76d7eb6a1e8e8ac38ce3cb9c26ed4d2e0eb332e128b1c" address="unix:///run/containerd/s/fa910b0096564f829f96bef37b710e3c4c708d3268958c6e3b9ea8f13f58b548" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:00:50.935522 systemd[1]: Started cri-containerd-13b670bd5fc4d31b0dc76d7eb6a1e8e8ac38ce3cb9c26ed4d2e0eb332e128b1c.scope - libcontainer container 13b670bd5fc4d31b0dc76d7eb6a1e8e8ac38ce3cb9c26ed4d2e0eb332e128b1c. 
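The kubepods-besteffort-pod...slice unit names in the two "Created slice" entries above are derived from the pod UID. A small sketch reproducing the mapping visible in this log; the helper name is made up for illustration and is not kubelet code:

def besteffort_slice(pod_uid: str) -> str:
    # Illustrative helper only. '-' is the hierarchy separator in systemd slice
    # names, so the kubelet's systemd cgroup driver swaps it for '_' in the UID.
    return f"kubepods-besteffort-pod{pod_uid.replace('-', '_')}.slice"

print(besteffort_slice("46595ec8-8c70-430f-be21-2beae49943e1"))  # kube-proxy-6p54l pod
print(besteffort_slice("f948374c-244b-49c7-b11f-37a1d5a95b75"))  # tigera-operator-7dcd859c48-jm8bw pod

Both outputs match the slice names systemd reports above.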
Jan 15 02:00:50.947000 audit: BPF prog-id=133 op=LOAD Jan 15 02:00:50.948690 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 15 02:00:50.948740 kernel: audit: type=1334 audit(1768442450.947:446): prog-id=133 op=LOAD Jan 15 02:00:50.950000 audit: BPF prog-id=134 op=LOAD Jan 15 02:00:50.950000 audit[2999]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=2987 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:50.955563 kernel: audit: type=1334 audit(1768442450.950:447): prog-id=134 op=LOAD Jan 15 02:00:50.955624 kernel: audit: type=1300 audit(1768442450.950:447): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=2987 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:50.950000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133623637306264356663346433316230646337366437656236613165 Jan 15 02:00:50.961212 kernel: audit: type=1327 audit(1768442450.950:447): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133623637306264356663346433316230646337366437656236613165 Jan 15 02:00:50.951000 audit: BPF prog-id=134 op=UNLOAD Jan 15 02:00:50.965448 kernel: audit: type=1334 audit(1768442450.951:448): prog-id=134 op=UNLOAD Jan 15 02:00:50.951000 audit[2999]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2987 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:50.968854 kernel: audit: type=1300 audit(1768442450.951:448): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2987 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:50.951000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133623637306264356663346433316230646337366437656236613165 Jan 15 02:00:50.951000 audit: BPF prog-id=135 op=LOAD Jan 15 02:00:50.980988 kernel: audit: type=1327 audit(1768442450.951:448): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133623637306264356663346433316230646337366437656236613165 Jan 15 02:00:50.981057 kernel: audit: type=1334 audit(1768442450.951:449): prog-id=135 op=LOAD Jan 15 02:00:50.981089 kernel: audit: type=1300 audit(1768442450.951:449): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=2987 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:50.951000 audit[2999]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=2987 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:50.951000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133623637306264356663346433316230646337366437656236613165 Jan 15 02:00:50.992194 kernel: audit: type=1327 audit(1768442450.951:449): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133623637306264356663346433316230646337366437656236613165 Jan 15 02:00:50.951000 audit: BPF prog-id=136 op=LOAD Jan 15 02:00:50.951000 audit[2999]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=2987 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:50.951000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133623637306264356663346433316230646337366437656236613165 Jan 15 02:00:50.951000 audit: BPF prog-id=136 op=UNLOAD Jan 15 02:00:50.951000 audit[2999]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2987 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:50.951000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133623637306264356663346433316230646337366437656236613165 Jan 15 02:00:50.951000 audit: BPF prog-id=135 op=UNLOAD Jan 15 02:00:50.951000 audit[2999]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2987 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:50.951000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133623637306264356663346433316230646337366437656236613165 Jan 15 02:00:50.951000 audit: BPF prog-id=137 op=LOAD Jan 15 02:00:50.951000 audit[2999]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=2987 pid=2999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:50.951000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133623637306264356663346433316230646337366437656236613165 Jan 15 02:00:50.997330 containerd[1710]: time="2026-01-15T02:00:50.997266011Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-6p54l,Uid:46595ec8-8c70-430f-be21-2beae49943e1,Namespace:kube-system,Attempt:0,} returns sandbox id \"13b670bd5fc4d31b0dc76d7eb6a1e8e8ac38ce3cb9c26ed4d2e0eb332e128b1c\"" Jan 15 02:00:51.004751 containerd[1710]: time="2026-01-15T02:00:51.004721183Z" level=info msg="CreateContainer within sandbox \"13b670bd5fc4d31b0dc76d7eb6a1e8e8ac38ce3cb9c26ed4d2e0eb332e128b1c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 15 02:00:51.023543 containerd[1710]: time="2026-01-15T02:00:51.022914074Z" level=info msg="Container 898d13b28cf1daec9ecf28524183fc33781b4fc88971c2a25de2d184357adcc9: CDI devices from CRI Config.CDIDevices: []" Jan 15 02:00:51.033819 containerd[1710]: time="2026-01-15T02:00:51.033800094Z" level=info msg="CreateContainer within sandbox \"13b670bd5fc4d31b0dc76d7eb6a1e8e8ac38ce3cb9c26ed4d2e0eb332e128b1c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"898d13b28cf1daec9ecf28524183fc33781b4fc88971c2a25de2d184357adcc9\"" Jan 15 02:00:51.034336 containerd[1710]: time="2026-01-15T02:00:51.034278307Z" level=info msg="StartContainer for \"898d13b28cf1daec9ecf28524183fc33781b4fc88971c2a25de2d184357adcc9\"" Jan 15 02:00:51.035541 containerd[1710]: time="2026-01-15T02:00:51.035523400Z" level=info msg="connecting to shim 898d13b28cf1daec9ecf28524183fc33781b4fc88971c2a25de2d184357adcc9" address="unix:///run/containerd/s/fa910b0096564f829f96bef37b710e3c4c708d3268958c6e3b9ea8f13f58b548" protocol=ttrpc version=3 Jan 15 02:00:51.053316 systemd[1]: Started cri-containerd-898d13b28cf1daec9ecf28524183fc33781b4fc88971c2a25de2d184357adcc9.scope - libcontainer container 898d13b28cf1daec9ecf28524183fc33781b4fc88971c2a25de2d184357adcc9. 
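The audit PROCTITLE records above (and throughout the rest of this capture) carry the audited command line as hex, with NUL bytes separating the arguments. A short Python sketch that decodes them, applied to the runc value logged at 02:00:50.950 above; decode_proctitle is an illustrative name, and the record itself is truncated, so the decoded sandbox ID is cut short as well:

def decode_proctitle(hex_value: str) -> str:
    # Illustrative helper: PROCTITLE stores argv as hex with NUL-separated arguments.
    return " ".join(part.decode() for part in bytes.fromhex(hex_value).split(b"\x00") if part)

runc_proctitle = (
    "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F00"
    "2D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F"
    "3133623637306264356663346433316230646337366437656236613165"
)
print(decode_proctitle(runc_proctitle))
# runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/13b670bd5fc4d31b0dc76d7eb6a1e

The same decoding applies to the iptables/ip6tables PROCTITLE records further below, which spell out the KUBE-PROXY-CANARY, KUBE-SERVICES and related chain setup performed by kube-proxy.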
Jan 15 02:00:51.113000 audit: BPF prog-id=138 op=LOAD Jan 15 02:00:51.113000 audit[3025]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=2987 pid=3025 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.113000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839386431336232386366316461656339656366323835323431383366 Jan 15 02:00:51.113000 audit: BPF prog-id=139 op=LOAD Jan 15 02:00:51.113000 audit[3025]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=2987 pid=3025 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.113000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839386431336232386366316461656339656366323835323431383366 Jan 15 02:00:51.113000 audit: BPF prog-id=139 op=UNLOAD Jan 15 02:00:51.113000 audit[3025]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2987 pid=3025 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.113000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839386431336232386366316461656339656366323835323431383366 Jan 15 02:00:51.113000 audit: BPF prog-id=138 op=UNLOAD Jan 15 02:00:51.113000 audit[3025]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2987 pid=3025 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.113000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839386431336232386366316461656339656366323835323431383366 Jan 15 02:00:51.113000 audit: BPF prog-id=140 op=LOAD Jan 15 02:00:51.113000 audit[3025]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=2987 pid=3025 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.113000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3839386431336232386366316461656339656366323835323431383366 Jan 15 02:00:51.126816 containerd[1710]: time="2026-01-15T02:00:51.126716246Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:tigera-operator-7dcd859c48-jm8bw,Uid:f948374c-244b-49c7-b11f-37a1d5a95b75,Namespace:tigera-operator,Attempt:0,}" Jan 15 02:00:51.143607 containerd[1710]: time="2026-01-15T02:00:51.143577974Z" level=info msg="StartContainer for \"898d13b28cf1daec9ecf28524183fc33781b4fc88971c2a25de2d184357adcc9\" returns successfully" Jan 15 02:00:51.164343 containerd[1710]: time="2026-01-15T02:00:51.164301846Z" level=info msg="connecting to shim c46388475d85c616673a40d94e9d78da7bdfb4a3ce9904e5f691b1233245deae" address="unix:///run/containerd/s/11d07c69f2310074ab452a1f098243831d6e7ef9d34d938c01583d401c7bb56d" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:00:51.191427 systemd[1]: Started cri-containerd-c46388475d85c616673a40d94e9d78da7bdfb4a3ce9904e5f691b1233245deae.scope - libcontainer container c46388475d85c616673a40d94e9d78da7bdfb4a3ce9904e5f691b1233245deae. Jan 15 02:00:51.209000 audit: BPF prog-id=141 op=LOAD Jan 15 02:00:51.210000 audit: BPF prog-id=142 op=LOAD Jan 15 02:00:51.210000 audit[3076]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=3065 pid=3076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.210000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334363338383437356438356336313636373361343064393465396437 Jan 15 02:00:51.211000 audit: BPF prog-id=142 op=UNLOAD Jan 15 02:00:51.211000 audit[3076]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3065 pid=3076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.211000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334363338383437356438356336313636373361343064393465396437 Jan 15 02:00:51.211000 audit: BPF prog-id=143 op=LOAD Jan 15 02:00:51.211000 audit[3076]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3065 pid=3076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.211000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334363338383437356438356336313636373361343064393465396437 Jan 15 02:00:51.211000 audit: BPF prog-id=144 op=LOAD Jan 15 02:00:51.211000 audit[3076]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3065 pid=3076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.211000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334363338383437356438356336313636373361343064393465396437 Jan 15 02:00:51.211000 audit: BPF prog-id=144 op=UNLOAD Jan 15 02:00:51.211000 audit[3076]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3065 pid=3076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.211000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334363338383437356438356336313636373361343064393465396437 Jan 15 02:00:51.211000 audit: BPF prog-id=143 op=UNLOAD Jan 15 02:00:51.211000 audit[3076]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3065 pid=3076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.211000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334363338383437356438356336313636373361343064393465396437 Jan 15 02:00:51.211000 audit: BPF prog-id=145 op=LOAD Jan 15 02:00:51.211000 audit[3076]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3065 pid=3076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.211000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334363338383437356438356336313636373361343064393465396437 Jan 15 02:00:51.251400 containerd[1710]: time="2026-01-15T02:00:51.251346157Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-jm8bw,Uid:f948374c-244b-49c7-b11f-37a1d5a95b75,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c46388475d85c616673a40d94e9d78da7bdfb4a3ce9904e5f691b1233245deae\"" Jan 15 02:00:51.253642 containerd[1710]: time="2026-01-15T02:00:51.253623353Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 15 02:00:51.275000 audit[3133]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3133 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.275000 audit[3133]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc534ea5b0 a2=0 a3=7ffc534ea59c items=0 ppid=3037 pid=3133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.275000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 15 02:00:51.278000 audit[3134]: NETFILTER_CFG table=mangle:55 family=10 entries=1 
op=nft_register_chain pid=3134 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.278000 audit[3134]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff96d904c0 a2=0 a3=7fff96d904ac items=0 ppid=3037 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.278000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 15 02:00:51.278000 audit[3136]: NETFILTER_CFG table=nat:56 family=2 entries=1 op=nft_register_chain pid=3136 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.278000 audit[3136]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc00fa0860 a2=0 a3=7ffc00fa084c items=0 ppid=3037 pid=3136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.278000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 15 02:00:51.279000 audit[3137]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_chain pid=3137 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.279000 audit[3137]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcf6fb9600 a2=0 a3=7ffcf6fb95ec items=0 ppid=3037 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.279000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 15 02:00:51.279000 audit[3138]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain pid=3138 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.279000 audit[3138]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe70e584b0 a2=0 a3=7ffe70e5849c items=0 ppid=3037 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.279000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 15 02:00:51.283000 audit[3139]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3139 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.283000 audit[3139]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd721c38b0 a2=0 a3=7ffd721c389c items=0 ppid=3037 pid=3139 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.283000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 15 02:00:51.385000 audit[3140]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3140 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.385000 audit[3140]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=108 a0=3 a1=7ffeb03a3fe0 a2=0 a3=7ffeb03a3fcc items=0 ppid=3037 pid=3140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.385000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 15 02:00:51.393000 audit[3142]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3142 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.393000 audit[3142]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffc7d665b20 a2=0 a3=7ffc7d665b0c items=0 ppid=3037 pid=3142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.393000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 15 02:00:51.404000 audit[3145]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3145 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.404000 audit[3145]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fff41932cb0 a2=0 a3=7fff41932c9c items=0 ppid=3037 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.404000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 15 02:00:51.407000 audit[3146]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3146 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.407000 audit[3146]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcbd9cc540 a2=0 a3=7ffcbd9cc52c items=0 ppid=3037 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.407000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 15 02:00:51.413000 audit[3148]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3148 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.413000 audit[3148]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fff9680cae0 a2=0 a3=7fff9680cacc items=0 ppid=3037 pid=3148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.413000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 15 02:00:51.416000 audit[3149]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.416000 audit[3149]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd7dcc5490 a2=0 a3=7ffd7dcc547c items=0 ppid=3037 pid=3149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.416000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 15 02:00:51.422000 audit[3151]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3151 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.422000 audit[3151]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffee859fa60 a2=0 a3=7ffee859fa4c items=0 ppid=3037 pid=3151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.422000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 15 02:00:51.433000 audit[3154]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3154 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.433000 audit[3154]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffc0e3a5d40 a2=0 a3=7ffc0e3a5d2c items=0 ppid=3037 pid=3154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.433000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 15 02:00:51.435000 audit[3155]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3155 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.435000 audit[3155]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffebd20a830 a2=0 a3=7ffebd20a81c items=0 ppid=3037 pid=3155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.435000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 15 02:00:51.441000 audit[3157]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3157 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.441000 audit[3157]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe10286680 a2=0 a3=7ffe1028666c items=0 
ppid=3037 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.441000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 15 02:00:51.443000 audit[3158]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3158 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.443000 audit[3158]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffb2631e70 a2=0 a3=7fffb2631e5c items=0 ppid=3037 pid=3158 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.443000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 15 02:00:51.449000 audit[3160]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3160 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.449000 audit[3160]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffdde11cc70 a2=0 a3=7ffdde11cc5c items=0 ppid=3037 pid=3160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.449000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 15 02:00:51.455000 audit[3163]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3163 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.455000 audit[3163]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffdbceaa120 a2=0 a3=7ffdbceaa10c items=0 ppid=3037 pid=3163 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.455000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 15 02:00:51.461000 audit[3166]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3166 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.461000 audit[3166]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fffb5eed7b0 a2=0 a3=7fffb5eed79c items=0 ppid=3037 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.461000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 15 02:00:51.463000 audit[3167]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3167 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.463000 audit[3167]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fffb81b77e0 a2=0 a3=7fffb81b77cc items=0 ppid=3037 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.463000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 15 02:00:51.467000 audit[3169]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3169 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.467000 audit[3169]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7fff5ee90190 a2=0 a3=7fff5ee9017c items=0 ppid=3037 pid=3169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.467000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 15 02:00:51.474000 audit[3172]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3172 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.474000 audit[3172]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd5ae83640 a2=0 a3=7ffd5ae8362c items=0 ppid=3037 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.474000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 15 02:00:51.477000 audit[3173]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3173 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.477000 audit[3173]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcc75b8140 a2=0 a3=7ffcc75b812c items=0 ppid=3037 pid=3173 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.477000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 15 02:00:51.483000 audit[3175]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3175 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 15 02:00:51.483000 audit[3175]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7fff0c194170 a2=0 a3=7fff0c19415c items=0 ppid=3037 pid=3175 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.483000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 15 02:00:51.536000 audit[3181]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3181 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:00:51.536000 audit[3181]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7fff869102e0 a2=0 a3=7fff869102cc items=0 ppid=3037 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.536000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:00:51.547000 audit[3181]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3181 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:00:51.547000 audit[3181]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7fff869102e0 a2=0 a3=7fff869102cc items=0 ppid=3037 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.547000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:00:51.549000 audit[3186]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3186 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.549000 audit[3186]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe8190cd20 a2=0 a3=7ffe8190cd0c items=0 ppid=3037 pid=3186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.549000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 15 02:00:51.556000 audit[3188]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3188 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.556000 audit[3188]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffcccce6e70 a2=0 a3=7ffcccce6e5c items=0 ppid=3037 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.556000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 15 02:00:51.564000 audit[3191]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3191 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.564000 audit[3191]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=752 a0=3 a1=7ffe0d2a39e0 a2=0 a3=7ffe0d2a39cc items=0 ppid=3037 pid=3191 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.564000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 15 02:00:51.567000 audit[3192]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3192 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.567000 audit[3192]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff989ff040 a2=0 a3=7fff989ff02c items=0 ppid=3037 pid=3192 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.567000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 15 02:00:51.573000 audit[3194]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3194 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.573000 audit[3194]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd21ca4430 a2=0 a3=7ffd21ca441c items=0 ppid=3037 pid=3194 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.573000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 15 02:00:51.575000 audit[3195]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3195 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.575000 audit[3195]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffc96f92a0 a2=0 a3=7fffc96f928c items=0 ppid=3037 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.575000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 15 02:00:51.580000 audit[3197]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3197 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.580000 audit[3197]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffd08a283e0 a2=0 a3=7ffd08a283cc items=0 ppid=3037 pid=3197 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.580000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 15 02:00:51.589000 audit[3200]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3200 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.589000 audit[3200]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffcd915bc60 a2=0 a3=7ffcd915bc4c items=0 ppid=3037 pid=3200 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.589000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 15 02:00:51.595000 audit[3201]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3201 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.595000 audit[3201]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcd5bf1bd0 a2=0 a3=7ffcd5bf1bbc items=0 ppid=3037 pid=3201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.595000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 15 02:00:51.600000 audit[3203]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3203 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.600000 audit[3203]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc138056a0 a2=0 a3=7ffc1380568c items=0 ppid=3037 pid=3203 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.600000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 15 02:00:51.603000 audit[3204]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3204 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.603000 audit[3204]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe91db1a80 a2=0 a3=7ffe91db1a6c items=0 ppid=3037 pid=3204 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.603000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 15 02:00:51.610000 audit[3206]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3206 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.610000 audit[3206]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff94281360 a2=0 a3=7fff9428134c 
items=0 ppid=3037 pid=3206 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.610000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 15 02:00:51.617000 audit[3209]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3209 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.617000 audit[3209]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff1e149a60 a2=0 a3=7fff1e149a4c items=0 ppid=3037 pid=3209 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.617000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 15 02:00:51.623000 audit[3212]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3212 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.623000 audit[3212]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7fff81c9f680 a2=0 a3=7fff81c9f66c items=0 ppid=3037 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.623000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 15 02:00:51.625000 audit[3213]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3213 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.625000 audit[3213]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fffdc54f030 a2=0 a3=7fffdc54f01c items=0 ppid=3037 pid=3213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.625000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 15 02:00:51.628000 audit[3215]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3215 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.628000 audit[3215]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7fffde4ac9e0 a2=0 a3=7fffde4ac9cc items=0 ppid=3037 pid=3215 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.628000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 15 02:00:51.633000 audit[3218]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3218 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.633000 audit[3218]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffc03549120 a2=0 a3=7ffc0354910c items=0 ppid=3037 pid=3218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.633000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 15 02:00:51.635000 audit[3219]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3219 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.635000 audit[3219]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc4d466670 a2=0 a3=7ffc4d46665c items=0 ppid=3037 pid=3219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.635000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 15 02:00:51.638000 audit[3221]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3221 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.638000 audit[3221]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffcbce13100 a2=0 a3=7ffcbce130ec items=0 ppid=3037 pid=3221 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.638000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 15 02:00:51.640000 audit[3222]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3222 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.640000 audit[3222]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff8f7ae530 a2=0 a3=7fff8f7ae51c items=0 ppid=3037 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.640000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 15 02:00:51.643000 audit[3224]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3224 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.643000 audit[3224]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffd32bc19e0 a2=0 a3=7ffd32bc19cc items=0 ppid=3037 pid=3224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.643000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 15 02:00:51.647000 audit[3227]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3227 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 15 02:00:51.647000 audit[3227]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffc2ecadaf0 a2=0 a3=7ffc2ecadadc items=0 ppid=3037 pid=3227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.647000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 15 02:00:51.650000 audit[3229]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3229 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 15 02:00:51.650000 audit[3229]: SYSCALL arch=c000003e syscall=46 success=yes exit=2088 a0=3 a1=7fff23ecb640 a2=0 a3=7fff23ecb62c items=0 ppid=3037 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.650000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:00:51.651000 audit[3229]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3229 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 15 02:00:51.651000 audit[3229]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7fff23ecb640 a2=0 a3=7fff23ecb62c items=0 ppid=3037 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:00:51.651000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:00:51.855797 kubelet[2935]: I0115 02:00:51.855704 2935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-6p54l" podStartSLOduration=1.855672607 podStartE2EDuration="1.855672607s" podCreationTimestamp="2026-01-15 02:00:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 02:00:51.840354328 +0000 UTC m=+8.218346245" watchObservedRunningTime="2026-01-15 02:00:51.855672607 +0000 UTC m=+8.233664551" Jan 15 02:00:53.121360 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3707622682.mount: Deactivated successfully. 
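The PROCTITLE field in each audit record above is the invoking command line, hex-encoded with NUL bytes separating the arguments, so the long hex strings are the literal iptables/ip6tables invocations issued while kube-proxy set up its chains. A minimal sketch for decoding them (decode_proctitle is an illustrative helper, not something that appears in this log):

# Minimal sketch: decode an audit PROCTITLE value (argv hex-encoded, NUL-separated).
def decode_proctitle(hex_value: str) -> str:
    args = bytes.fromhex(hex_value).split(b"\x00")
    return " ".join(arg.decode("utf-8", "replace") for arg in args if arg)

# First PROCTITLE recorded above; prints (on one line):
# iptables -w 5 -W 100000 -I INPUT -t filter -m comment --comment kubernetes health check service ports -j KUBE-NODEPORTS
print(decode_proctitle(
    "69707461626C6573002D770035002D5700313030303030"    # iptables -w 5 -W 100000
    "002D4900494E505554002D740066696C746572"            # -I INPUT -t filter
    "002D6D00636F6D6D656E74002D2D636F6D6D656E7400"      # -m comment --comment
    "6B756265726E65746573206865616C746820636865636B20"  # kubernetes health check
    "7365727669636520706F727473"                        # service ports
    "002D6A004B5542452D4E4F4445504F525453"              # -j KUBE-NODEPORTS
))

Note that the --comment value is a single NUL-delimited argument in the original argv, so the space-joined output above drops its quoting.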
Jan 15 02:01:02.736952 containerd[1710]: time="2026-01-15T02:01:02.736888222Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:02.738084 containerd[1710]: time="2026-01-15T02:01:02.738061977Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=23558205" Jan 15 02:01:02.739885 containerd[1710]: time="2026-01-15T02:01:02.739841180Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:02.743041 containerd[1710]: time="2026-01-15T02:01:02.742538195Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:02.743041 containerd[1710]: time="2026-01-15T02:01:02.742942434Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 11.48921293s" Jan 15 02:01:02.743041 containerd[1710]: time="2026-01-15T02:01:02.742966635Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Jan 15 02:01:02.745877 containerd[1710]: time="2026-01-15T02:01:02.745855377Z" level=info msg="CreateContainer within sandbox \"c46388475d85c616673a40d94e9d78da7bdfb4a3ce9904e5f691b1233245deae\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 15 02:01:02.757918 containerd[1710]: time="2026-01-15T02:01:02.757501142Z" level=info msg="Container 020c9357cf6ad264923341d0c768503934c29ff199090781cd2c51b1b562d146: CDI devices from CRI Config.CDIDevices: []" Jan 15 02:01:02.759924 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2697883417.mount: Deactivated successfully. Jan 15 02:01:02.769070 containerd[1710]: time="2026-01-15T02:01:02.769045146Z" level=info msg="CreateContainer within sandbox \"c46388475d85c616673a40d94e9d78da7bdfb4a3ce9904e5f691b1233245deae\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"020c9357cf6ad264923341d0c768503934c29ff199090781cd2c51b1b562d146\"" Jan 15 02:01:02.770308 containerd[1710]: time="2026-01-15T02:01:02.770251783Z" level=info msg="StartContainer for \"020c9357cf6ad264923341d0c768503934c29ff199090781cd2c51b1b562d146\"" Jan 15 02:01:02.771106 containerd[1710]: time="2026-01-15T02:01:02.771075349Z" level=info msg="connecting to shim 020c9357cf6ad264923341d0c768503934c29ff199090781cd2c51b1b562d146" address="unix:///run/containerd/s/11d07c69f2310074ab452a1f098243831d6e7ef9d34d938c01583d401c7bb56d" protocol=ttrpc version=3 Jan 15 02:01:02.789390 systemd[1]: Started cri-containerd-020c9357cf6ad264923341d0c768503934c29ff199090781cd2c51b1b562d146.scope - libcontainer container 020c9357cf6ad264923341d0c768503934c29ff199090781cd2c51b1b562d146. 
Jan 15 02:01:02.797000 audit: BPF prog-id=146 op=LOAD Jan 15 02:01:02.799490 kernel: kauditd_printk_skb: 202 callbacks suppressed Jan 15 02:01:02.799529 kernel: audit: type=1334 audit(1768442462.797:518): prog-id=146 op=LOAD Jan 15 02:01:02.799000 audit: BPF prog-id=147 op=LOAD Jan 15 02:01:02.799000 audit[3240]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=3065 pid=3240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:02.804569 kernel: audit: type=1334 audit(1768442462.799:519): prog-id=147 op=LOAD Jan 15 02:01:02.804616 kernel: audit: type=1300 audit(1768442462.799:519): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138238 a2=98 a3=0 items=0 ppid=3065 pid=3240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:02.807206 kernel: audit: type=1327 audit(1768442462.799:519): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032306339333537636636616432363439323333343164306337363835 Jan 15 02:01:02.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032306339333537636636616432363439323333343164306337363835 Jan 15 02:01:02.799000 audit: BPF prog-id=147 op=UNLOAD Jan 15 02:01:02.811600 kernel: audit: type=1334 audit(1768442462.799:520): prog-id=147 op=UNLOAD Jan 15 02:01:02.799000 audit[3240]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3065 pid=3240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:02.814221 kernel: audit: type=1300 audit(1768442462.799:520): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3065 pid=3240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:02.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032306339333537636636616432363439323333343164306337363835 Jan 15 02:01:02.818171 kernel: audit: type=1327 audit(1768442462.799:520): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032306339333537636636616432363439323333343164306337363835 Jan 15 02:01:02.799000 audit: BPF prog-id=148 op=LOAD Jan 15 02:01:02.822087 kernel: audit: type=1334 audit(1768442462.799:521): prog-id=148 op=LOAD Jan 15 02:01:02.799000 audit[3240]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=3065 pid=3240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:02.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032306339333537636636616432363439323333343164306337363835 Jan 15 02:01:02.827894 kernel: audit: type=1300 audit(1768442462.799:521): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=3065 pid=3240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:02.827944 kernel: audit: type=1327 audit(1768442462.799:521): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032306339333537636636616432363439323333343164306337363835 Jan 15 02:01:02.799000 audit: BPF prog-id=149 op=LOAD Jan 15 02:01:02.799000 audit[3240]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=3065 pid=3240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:02.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032306339333537636636616432363439323333343164306337363835 Jan 15 02:01:02.799000 audit: BPF prog-id=149 op=UNLOAD Jan 15 02:01:02.799000 audit[3240]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3065 pid=3240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:02.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032306339333537636636616432363439323333343164306337363835 Jan 15 02:01:02.799000 audit: BPF prog-id=148 op=UNLOAD Jan 15 02:01:02.799000 audit[3240]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3065 pid=3240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:02.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032306339333537636636616432363439323333343164306337363835 Jan 15 02:01:02.800000 audit: BPF prog-id=150 op=LOAD Jan 15 02:01:02.800000 audit[3240]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=3065 pid=3240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:02.800000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032306339333537636636616432363439323333343164306337363835 Jan 15 02:01:02.837861 containerd[1710]: time="2026-01-15T02:01:02.837808629Z" level=info msg="StartContainer for \"020c9357cf6ad264923341d0c768503934c29ff199090781cd2c51b1b562d146\" returns successfully" Jan 15 02:01:02.856191 kubelet[2935]: I0115 02:01:02.856044 2935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-jm8bw" podStartSLOduration=1.365542089 podStartE2EDuration="12.856032858s" podCreationTimestamp="2026-01-15 02:00:50 +0000 UTC" firstStartedPulling="2026-01-15 02:00:51.253125162 +0000 UTC m=+7.631116980" lastFinishedPulling="2026-01-15 02:01:02.743615929 +0000 UTC m=+19.121607749" observedRunningTime="2026-01-15 02:01:02.855814865 +0000 UTC m=+19.233806707" watchObservedRunningTime="2026-01-15 02:01:02.856032858 +0000 UTC m=+19.234024712" Jan 15 02:01:08.486574 sudo[1984]: pam_unix(sudo:session): session closed for user root Jan 15 02:01:08.491045 kernel: kauditd_printk_skb: 12 callbacks suppressed Jan 15 02:01:08.491144 kernel: audit: type=1106 audit(1768442468.485:526): pid=1984 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 02:01:08.485000 audit[1984]: USER_END pid=1984 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 02:01:08.485000 audit[1984]: CRED_DISP pid=1984 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 02:01:08.492856 kernel: audit: type=1104 audit(1768442468.485:527): pid=1984 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 15 02:01:08.586359 sshd[1971]: Connection closed by 4.153.228.146 port 36224 Jan 15 02:01:08.587303 sshd-session[1965]: pam_unix(sshd:session): session closed for user core Jan 15 02:01:08.587000 audit[1965]: USER_END pid=1965 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:01:08.592319 systemd[1]: sshd@8-10.0.1.164:22-4.153.228.146:36224.service: Deactivated successfully. Jan 15 02:01:08.594437 kernel: audit: type=1106 audit(1768442468.587:528): pid=1965 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:01:08.596770 systemd[1]: session-9.scope: Deactivated successfully. Jan 15 02:01:08.598037 systemd[1]: session-9.scope: Consumed 6.739s CPU time, 230.5M memory peak. 
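The pod_startup_latency_tracker record above for the tigera-operator pod reports both an end-to-end duration and an SLO duration. The relationship that holds for the values logged here is that the SLO figure equals the end-to-end startup time minus the image-pull window; treating that as the rule is an assumption, but it reproduces the logged number exactly:

# Sketch of the durations in the pod_startup_latency_tracker record above
# (assumption: SLO duration = end-to-end startup time minus image-pull window).
e2e = 12.856032858                 # observedRunningTime - podCreationTimestamp
pull = 19.121607749 - 7.631116980  # lastFinishedPulling - firstStartedPulling (monotonic m= offsets)
print(round(e2e - pull, 9))        # 1.365542089, the podStartSLOduration logged above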
Jan 15 02:01:08.587000 audit[1965]: CRED_DISP pid=1965 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:01:08.602685 systemd-logind[1687]: Session 9 logged out. Waiting for processes to exit. Jan 15 02:01:08.603426 kernel: audit: type=1104 audit(1768442468.587:529): pid=1965 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:01:08.604252 systemd-logind[1687]: Removed session 9. Jan 15 02:01:08.590000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.1.164:22-4.153.228.146:36224 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:01:08.610215 kernel: audit: type=1131 audit(1768442468.590:530): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.1.164:22-4.153.228.146:36224 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:01:09.333000 audit[3321]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3321 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:09.339436 kernel: audit: type=1325 audit(1768442469.333:531): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3321 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:09.333000 audit[3321]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffcd5187410 a2=0 a3=7ffcd51873fc items=0 ppid=3037 pid=3321 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:09.344173 kernel: audit: type=1300 audit(1768442469.333:531): arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7ffcd5187410 a2=0 a3=7ffcd51873fc items=0 ppid=3037 pid=3321 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:09.333000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:09.348176 kernel: audit: type=1327 audit(1768442469.333:531): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:09.343000 audit[3321]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3321 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:09.351363 kernel: audit: type=1325 audit(1768442469.343:532): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3321 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:09.343000 audit[3321]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcd5187410 a2=0 a3=0 items=0 ppid=3037 pid=3321 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:09.343000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:09.359184 kernel: audit: type=1300 audit(1768442469.343:532): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcd5187410 a2=0 a3=0 items=0 ppid=3037 pid=3321 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:09.358000 audit[3323]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:09.358000 audit[3323]: SYSCALL arch=c000003e syscall=46 success=yes exit=5992 a0=3 a1=7fff5a00c150 a2=0 a3=7fff5a00c13c items=0 ppid=3037 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:09.358000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:09.362000 audit[3323]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:09.362000 audit[3323]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff5a00c150 a2=0 a3=0 items=0 ppid=3037 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:09.362000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:11.194000 audit[3325]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3325 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:11.194000 audit[3325]: SYSCALL arch=c000003e syscall=46 success=yes exit=6736 a0=3 a1=7ffcf91abb60 a2=0 a3=7ffcf91abb4c items=0 ppid=3037 pid=3325 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:11.194000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:11.199000 audit[3325]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3325 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:11.199000 audit[3325]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcf91abb60 a2=0 a3=0 items=0 ppid=3037 pid=3325 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:11.199000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:12.236000 audit[3327]: NETFILTER_CFG table=filter:111 family=2 entries=19 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:12.236000 audit[3327]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffcf5f23850 a2=0 a3=7ffcf5f2383c items=0 
ppid=3037 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:12.236000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:12.240000 audit[3327]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:12.240000 audit[3327]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcf5f23850 a2=0 a3=0 items=0 ppid=3037 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:12.240000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:12.749000 audit[3329]: NETFILTER_CFG table=filter:113 family=2 entries=21 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:12.749000 audit[3329]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffc14c78e80 a2=0 a3=7ffc14c78e6c items=0 ppid=3037 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:12.749000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:12.752000 audit[3329]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:12.752000 audit[3329]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc14c78e80 a2=0 a3=0 items=0 ppid=3037 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:12.752000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:12.788287 systemd[1]: Created slice kubepods-besteffort-pod59eaf0bf_ed1f_4ebd_bf5d_3a45ac0ccc93.slice - libcontainer container kubepods-besteffort-pod59eaf0bf_ed1f_4ebd_bf5d_3a45ac0ccc93.slice. 
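The slice name in the record above embeds the pod's UID, with dashes escaped to underscores for the systemd unit name, under the besteffort branch of the kubepods hierarchy; the unescaped UID (59eaf0bf-ed1f-4ebd-bf5d-3a45ac0ccc93) reappears in the volume records below. A small sketch of that naming, assuming the besteffort QoS class seen here (pod_slice_name is illustrative, not kubelet code):

def pod_slice_name(pod_uid: str, qos_class: str = "besteffort") -> str:
    # Dashes in the UID are escaped to underscores in the systemd unit name.
    return f"kubepods-{qos_class}-pod{pod_uid.replace('-', '_')}.slice"

# Reproduces the slice created above for the calico-typha pod:
# kubepods-besteffort-pod59eaf0bf_ed1f_4ebd_bf5d_3a45ac0ccc93.slice
print(pod_slice_name("59eaf0bf-ed1f-4ebd-bf5d-3a45ac0ccc93"))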
Jan 15 02:01:12.820928 kubelet[2935]: I0115 02:01:12.820902 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/59eaf0bf-ed1f-4ebd-bf5d-3a45ac0ccc93-typha-certs\") pod \"calico-typha-c5b6dfdcd-dqxw4\" (UID: \"59eaf0bf-ed1f-4ebd-bf5d-3a45ac0ccc93\") " pod="calico-system/calico-typha-c5b6dfdcd-dqxw4" Jan 15 02:01:12.821233 kubelet[2935]: I0115 02:01:12.820934 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b4dp\" (UniqueName: \"kubernetes.io/projected/59eaf0bf-ed1f-4ebd-bf5d-3a45ac0ccc93-kube-api-access-9b4dp\") pod \"calico-typha-c5b6dfdcd-dqxw4\" (UID: \"59eaf0bf-ed1f-4ebd-bf5d-3a45ac0ccc93\") " pod="calico-system/calico-typha-c5b6dfdcd-dqxw4" Jan 15 02:01:12.821233 kubelet[2935]: I0115 02:01:12.820958 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59eaf0bf-ed1f-4ebd-bf5d-3a45ac0ccc93-tigera-ca-bundle\") pod \"calico-typha-c5b6dfdcd-dqxw4\" (UID: \"59eaf0bf-ed1f-4ebd-bf5d-3a45ac0ccc93\") " pod="calico-system/calico-typha-c5b6dfdcd-dqxw4" Jan 15 02:01:12.910398 systemd[1]: Created slice kubepods-besteffort-podfe38116a_5a64_40d4_9a99_99055821c317.slice - libcontainer container kubepods-besteffort-podfe38116a_5a64_40d4_9a99_99055821c317.slice. Jan 15 02:01:12.922032 kubelet[2935]: I0115 02:01:12.922004 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/fe38116a-5a64-40d4-9a99-99055821c317-policysync\") pod \"calico-node-klwkl\" (UID: \"fe38116a-5a64-40d4-9a99-99055821c317\") " pod="calico-system/calico-node-klwkl" Jan 15 02:01:12.922141 kubelet[2935]: I0115 02:01:12.922038 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/fe38116a-5a64-40d4-9a99-99055821c317-flexvol-driver-host\") pod \"calico-node-klwkl\" (UID: \"fe38116a-5a64-40d4-9a99-99055821c317\") " pod="calico-system/calico-node-klwkl" Jan 15 02:01:12.922141 kubelet[2935]: I0115 02:01:12.922059 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/fe38116a-5a64-40d4-9a99-99055821c317-cni-bin-dir\") pod \"calico-node-klwkl\" (UID: \"fe38116a-5a64-40d4-9a99-99055821c317\") " pod="calico-system/calico-node-klwkl" Jan 15 02:01:12.922141 kubelet[2935]: I0115 02:01:12.922075 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe38116a-5a64-40d4-9a99-99055821c317-lib-modules\") pod \"calico-node-klwkl\" (UID: \"fe38116a-5a64-40d4-9a99-99055821c317\") " pod="calico-system/calico-node-klwkl" Jan 15 02:01:12.922141 kubelet[2935]: I0115 02:01:12.922090 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fe38116a-5a64-40d4-9a99-99055821c317-xtables-lock\") pod \"calico-node-klwkl\" (UID: \"fe38116a-5a64-40d4-9a99-99055821c317\") " pod="calico-system/calico-node-klwkl" Jan 15 02:01:12.922141 kubelet[2935]: I0115 02:01:12.922130 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" 
(UniqueName: \"kubernetes.io/host-path/fe38116a-5a64-40d4-9a99-99055821c317-var-lib-calico\") pod \"calico-node-klwkl\" (UID: \"fe38116a-5a64-40d4-9a99-99055821c317\") " pod="calico-system/calico-node-klwkl" Jan 15 02:01:12.923093 kubelet[2935]: I0115 02:01:12.922145 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/fe38116a-5a64-40d4-9a99-99055821c317-var-run-calico\") pod \"calico-node-klwkl\" (UID: \"fe38116a-5a64-40d4-9a99-99055821c317\") " pod="calico-system/calico-node-klwkl" Jan 15 02:01:12.923093 kubelet[2935]: I0115 02:01:12.922172 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe38116a-5a64-40d4-9a99-99055821c317-tigera-ca-bundle\") pod \"calico-node-klwkl\" (UID: \"fe38116a-5a64-40d4-9a99-99055821c317\") " pod="calico-system/calico-node-klwkl" Jan 15 02:01:12.923093 kubelet[2935]: I0115 02:01:12.922187 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/fe38116a-5a64-40d4-9a99-99055821c317-cni-log-dir\") pod \"calico-node-klwkl\" (UID: \"fe38116a-5a64-40d4-9a99-99055821c317\") " pod="calico-system/calico-node-klwkl" Jan 15 02:01:12.923093 kubelet[2935]: I0115 02:01:12.922211 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/fe38116a-5a64-40d4-9a99-99055821c317-cni-net-dir\") pod \"calico-node-klwkl\" (UID: \"fe38116a-5a64-40d4-9a99-99055821c317\") " pod="calico-system/calico-node-klwkl" Jan 15 02:01:12.923093 kubelet[2935]: I0115 02:01:12.922227 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25xfq\" (UniqueName: \"kubernetes.io/projected/fe38116a-5a64-40d4-9a99-99055821c317-kube-api-access-25xfq\") pod \"calico-node-klwkl\" (UID: \"fe38116a-5a64-40d4-9a99-99055821c317\") " pod="calico-system/calico-node-klwkl" Jan 15 02:01:12.923288 kubelet[2935]: I0115 02:01:12.922262 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/fe38116a-5a64-40d4-9a99-99055821c317-node-certs\") pod \"calico-node-klwkl\" (UID: \"fe38116a-5a64-40d4-9a99-99055821c317\") " pod="calico-system/calico-node-klwkl" Jan 15 02:01:13.036126 kubelet[2935]: E0115 02:01:13.035700 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.036126 kubelet[2935]: W0115 02:01:13.035734 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.036126 kubelet[2935]: E0115 02:01:13.035786 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:13.050720 kubelet[2935]: E0115 02:01:13.050464 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.050720 kubelet[2935]: W0115 02:01:13.050489 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.050720 kubelet[2935]: E0115 02:01:13.050516 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.094634 containerd[1710]: time="2026-01-15T02:01:13.094574578Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-c5b6dfdcd-dqxw4,Uid:59eaf0bf-ed1f-4ebd-bf5d-3a45ac0ccc93,Namespace:calico-system,Attempt:0,}" Jan 15 02:01:13.103846 kubelet[2935]: E0115 02:01:13.103374 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:01:13.136474 containerd[1710]: time="2026-01-15T02:01:13.136445989Z" level=info msg="connecting to shim c83e1bd24bcc10d3ff918dc1b05da1ce43d38f0ef1e94e6ee457537e6ef7ac17" address="unix:///run/containerd/s/89363750e11c1ae43c0acf87e48566f7cf194a4123e99cc64bd2ff6edea4215a" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:01:13.163381 systemd[1]: Started cri-containerd-c83e1bd24bcc10d3ff918dc1b05da1ce43d38f0ef1e94e6ee457537e6ef7ac17.scope - libcontainer container c83e1bd24bcc10d3ff918dc1b05da1ce43d38f0ef1e94e6ee457537e6ef7ac17. 
Jan 15 02:01:13.182000 audit: BPF prog-id=151 op=LOAD Jan 15 02:01:13.182000 audit: BPF prog-id=152 op=LOAD Jan 15 02:01:13.182000 audit[3367]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=3356 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.182000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338336531626432346263633130643366663931386463316230356461 Jan 15 02:01:13.183000 audit: BPF prog-id=152 op=UNLOAD Jan 15 02:01:13.183000 audit[3367]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3356 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.183000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338336531626432346263633130643366663931386463316230356461 Jan 15 02:01:13.183000 audit: BPF prog-id=153 op=LOAD Jan 15 02:01:13.183000 audit[3367]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=3356 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.183000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338336531626432346263633130643366663931386463316230356461 Jan 15 02:01:13.183000 audit: BPF prog-id=154 op=LOAD Jan 15 02:01:13.183000 audit[3367]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=3356 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.183000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338336531626432346263633130643366663931386463316230356461 Jan 15 02:01:13.183000 audit: BPF prog-id=154 op=UNLOAD Jan 15 02:01:13.183000 audit[3367]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3356 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.183000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338336531626432346263633130643366663931386463316230356461 Jan 15 02:01:13.183000 audit: BPF prog-id=153 op=UNLOAD Jan 15 02:01:13.183000 audit[3367]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3356 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.183000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338336531626432346263633130643366663931386463316230356461 Jan 15 02:01:13.183000 audit: BPF prog-id=155 op=LOAD Jan 15 02:01:13.183000 audit[3367]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=3356 pid=3367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.183000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338336531626432346263633130643366663931386463316230356461 Jan 15 02:01:13.202545 kubelet[2935]: E0115 02:01:13.202528 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.202645 kubelet[2935]: W0115 02:01:13.202633 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.202703 kubelet[2935]: E0115 02:01:13.202693 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.202924 kubelet[2935]: E0115 02:01:13.202916 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.203001 kubelet[2935]: W0115 02:01:13.202962 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.203001 kubelet[2935]: E0115 02:01:13.202973 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.203226 kubelet[2935]: E0115 02:01:13.203218 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.203325 kubelet[2935]: W0115 02:01:13.203279 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.203325 kubelet[2935]: E0115 02:01:13.203289 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:13.203846 kubelet[2935]: E0115 02:01:13.203655 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.203846 kubelet[2935]: W0115 02:01:13.203665 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.203846 kubelet[2935]: E0115 02:01:13.203674 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.204397 kubelet[2935]: E0115 02:01:13.204192 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.204397 kubelet[2935]: W0115 02:01:13.204202 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.204397 kubelet[2935]: E0115 02:01:13.204213 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.205013 kubelet[2935]: E0115 02:01:13.204929 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.205013 kubelet[2935]: W0115 02:01:13.204946 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.205013 kubelet[2935]: E0115 02:01:13.204956 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.206187 kubelet[2935]: E0115 02:01:13.206173 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.206288 kubelet[2935]: W0115 02:01:13.206240 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.206288 kubelet[2935]: E0115 02:01:13.206252 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.206543 kubelet[2935]: E0115 02:01:13.206534 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.206644 kubelet[2935]: W0115 02:01:13.206596 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.206644 kubelet[2935]: E0115 02:01:13.206607 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:13.206866 kubelet[2935]: E0115 02:01:13.206823 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.206866 kubelet[2935]: W0115 02:01:13.206831 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.206866 kubelet[2935]: E0115 02:01:13.206839 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.207871 kubelet[2935]: E0115 02:01:13.207814 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.207871 kubelet[2935]: W0115 02:01:13.207825 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.207871 kubelet[2935]: E0115 02:01:13.207836 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.208170 kubelet[2935]: E0115 02:01:13.208106 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.208170 kubelet[2935]: W0115 02:01:13.208115 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.208170 kubelet[2935]: E0115 02:01:13.208124 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.208398 kubelet[2935]: E0115 02:01:13.208391 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.208485 kubelet[2935]: W0115 02:01:13.208433 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.208485 kubelet[2935]: E0115 02:01:13.208443 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.208991 kubelet[2935]: E0115 02:01:13.208926 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.209315 kubelet[2935]: W0115 02:01:13.209084 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.209315 kubelet[2935]: E0115 02:01:13.209097 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:13.209790 kubelet[2935]: E0115 02:01:13.209662 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.209790 kubelet[2935]: W0115 02:01:13.209672 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.209790 kubelet[2935]: E0115 02:01:13.209681 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.210317 kubelet[2935]: E0115 02:01:13.210178 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.210317 kubelet[2935]: W0115 02:01:13.210193 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.210317 kubelet[2935]: E0115 02:01:13.210205 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.210782 kubelet[2935]: E0115 02:01:13.210718 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.210782 kubelet[2935]: W0115 02:01:13.210727 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.210782 kubelet[2935]: E0115 02:01:13.210736 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.211254 kubelet[2935]: E0115 02:01:13.211201 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.211254 kubelet[2935]: W0115 02:01:13.211210 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.211254 kubelet[2935]: E0115 02:01:13.211220 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.211709 kubelet[2935]: E0115 02:01:13.211656 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.211709 kubelet[2935]: W0115 02:01:13.211666 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.211709 kubelet[2935]: E0115 02:01:13.211676 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:13.212030 kubelet[2935]: E0115 02:01:13.211994 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.212030 kubelet[2935]: W0115 02:01:13.212003 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.212030 kubelet[2935]: E0115 02:01:13.212012 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.212672 kubelet[2935]: E0115 02:01:13.212561 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.212672 kubelet[2935]: W0115 02:01:13.212571 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.212672 kubelet[2935]: E0115 02:01:13.212580 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.215335 containerd[1710]: time="2026-01-15T02:01:13.215306752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-klwkl,Uid:fe38116a-5a64-40d4-9a99-99055821c317,Namespace:calico-system,Attempt:0,}" Jan 15 02:01:13.227988 kubelet[2935]: E0115 02:01:13.227834 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.227988 kubelet[2935]: W0115 02:01:13.227849 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.227988 kubelet[2935]: E0115 02:01:13.227860 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.227988 kubelet[2935]: I0115 02:01:13.227882 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/328d556f-d445-4769-97a7-1a5530a232c4-socket-dir\") pod \"csi-node-driver-hdqqp\" (UID: \"328d556f-d445-4769-97a7-1a5530a232c4\") " pod="calico-system/csi-node-driver-hdqqp" Jan 15 02:01:13.228176 kubelet[2935]: E0115 02:01:13.228159 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.228423 kubelet[2935]: W0115 02:01:13.228308 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.228423 kubelet[2935]: E0115 02:01:13.228322 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:13.228423 kubelet[2935]: I0115 02:01:13.228341 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ktgl\" (UniqueName: \"kubernetes.io/projected/328d556f-d445-4769-97a7-1a5530a232c4-kube-api-access-6ktgl\") pod \"csi-node-driver-hdqqp\" (UID: \"328d556f-d445-4769-97a7-1a5530a232c4\") " pod="calico-system/csi-node-driver-hdqqp" Jan 15 02:01:13.228694 kubelet[2935]: E0115 02:01:13.228683 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.228832 kubelet[2935]: W0115 02:01:13.228734 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.228832 kubelet[2935]: E0115 02:01:13.228745 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.228832 kubelet[2935]: I0115 02:01:13.228760 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/328d556f-d445-4769-97a7-1a5530a232c4-kubelet-dir\") pod \"csi-node-driver-hdqqp\" (UID: \"328d556f-d445-4769-97a7-1a5530a232c4\") " pod="calico-system/csi-node-driver-hdqqp" Jan 15 02:01:13.229138 kubelet[2935]: E0115 02:01:13.229014 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.229138 kubelet[2935]: W0115 02:01:13.229022 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.229138 kubelet[2935]: E0115 02:01:13.229030 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.229138 kubelet[2935]: I0115 02:01:13.229051 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/328d556f-d445-4769-97a7-1a5530a232c4-registration-dir\") pod \"csi-node-driver-hdqqp\" (UID: \"328d556f-d445-4769-97a7-1a5530a232c4\") " pod="calico-system/csi-node-driver-hdqqp" Jan 15 02:01:13.229482 kubelet[2935]: E0115 02:01:13.229466 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.229530 kubelet[2935]: W0115 02:01:13.229522 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.229579 kubelet[2935]: E0115 02:01:13.229573 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:13.229735 kubelet[2935]: I0115 02:01:13.229708 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/328d556f-d445-4769-97a7-1a5530a232c4-varrun\") pod \"csi-node-driver-hdqqp\" (UID: \"328d556f-d445-4769-97a7-1a5530a232c4\") " pod="calico-system/csi-node-driver-hdqqp" Jan 15 02:01:13.229806 kubelet[2935]: E0115 02:01:13.229801 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.229844 kubelet[2935]: W0115 02:01:13.229834 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.229908 kubelet[2935]: E0115 02:01:13.229878 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.230071 kubelet[2935]: E0115 02:01:13.230057 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.230071 kubelet[2935]: W0115 02:01:13.230064 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.230198 kubelet[2935]: E0115 02:01:13.230129 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.230799 kubelet[2935]: E0115 02:01:13.230688 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.230799 kubelet[2935]: W0115 02:01:13.230697 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.230799 kubelet[2935]: E0115 02:01:13.230717 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.231014 kubelet[2935]: E0115 02:01:13.230915 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.231014 kubelet[2935]: W0115 02:01:13.230921 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.231014 kubelet[2935]: E0115 02:01:13.230995 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:13.231231 kubelet[2935]: E0115 02:01:13.231116 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.231231 kubelet[2935]: W0115 02:01:13.231122 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.231231 kubelet[2935]: E0115 02:01:13.231179 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.231517 kubelet[2935]: E0115 02:01:13.231427 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.231517 kubelet[2935]: W0115 02:01:13.231435 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.231517 kubelet[2935]: E0115 02:01:13.231505 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.231754 kubelet[2935]: E0115 02:01:13.231747 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.231790 kubelet[2935]: W0115 02:01:13.231785 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.232020 kubelet[2935]: E0115 02:01:13.231974 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.232200 kubelet[2935]: E0115 02:01:13.232193 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.232246 kubelet[2935]: W0115 02:01:13.232240 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.232286 kubelet[2935]: E0115 02:01:13.232279 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.232508 kubelet[2935]: E0115 02:01:13.232501 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.232567 kubelet[2935]: W0115 02:01:13.232550 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.232567 kubelet[2935]: E0115 02:01:13.232559 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:13.232805 kubelet[2935]: E0115 02:01:13.232774 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.232805 kubelet[2935]: W0115 02:01:13.232782 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.232805 kubelet[2935]: E0115 02:01:13.232789 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.241815 containerd[1710]: time="2026-01-15T02:01:13.241778506Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-c5b6dfdcd-dqxw4,Uid:59eaf0bf-ed1f-4ebd-bf5d-3a45ac0ccc93,Namespace:calico-system,Attempt:0,} returns sandbox id \"c83e1bd24bcc10d3ff918dc1b05da1ce43d38f0ef1e94e6ee457537e6ef7ac17\"" Jan 15 02:01:13.244098 containerd[1710]: time="2026-01-15T02:01:13.243191371Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 15 02:01:13.249947 containerd[1710]: time="2026-01-15T02:01:13.249873300Z" level=info msg="connecting to shim 767b33bc1a2bdfa0e4a2648c65f7143f37c973fd663de3ea14aa87302928e144" address="unix:///run/containerd/s/bbfaede1380bd71675d129843fde107fac543ba8bb5b335042a48815bd8a3707" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:01:13.272306 systemd[1]: Started cri-containerd-767b33bc1a2bdfa0e4a2648c65f7143f37c973fd663de3ea14aa87302928e144.scope - libcontainer container 767b33bc1a2bdfa0e4a2648c65f7143f37c973fd663de3ea14aa87302928e144. Jan 15 02:01:13.283000 audit: BPF prog-id=156 op=LOAD Jan 15 02:01:13.284000 audit: BPF prog-id=157 op=LOAD Jan 15 02:01:13.284000 audit[3449]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.284000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3736376233336263316132626466613065346132363438633635663731 Jan 15 02:01:13.284000 audit: BPF prog-id=157 op=UNLOAD Jan 15 02:01:13.284000 audit[3449]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.284000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3736376233336263316132626466613065346132363438633635663731 Jan 15 02:01:13.284000 audit: BPF prog-id=158 op=LOAD Jan 15 02:01:13.284000 audit[3449]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.284000 
audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3736376233336263316132626466613065346132363438633635663731 Jan 15 02:01:13.284000 audit: BPF prog-id=159 op=LOAD Jan 15 02:01:13.284000 audit[3449]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.284000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3736376233336263316132626466613065346132363438633635663731 Jan 15 02:01:13.284000 audit: BPF prog-id=159 op=UNLOAD Jan 15 02:01:13.284000 audit[3449]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.284000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3736376233336263316132626466613065346132363438633635663731 Jan 15 02:01:13.284000 audit: BPF prog-id=158 op=UNLOAD Jan 15 02:01:13.284000 audit[3449]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.284000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3736376233336263316132626466613065346132363438633635663731 Jan 15 02:01:13.284000 audit: BPF prog-id=160 op=LOAD Jan 15 02:01:13.284000 audit[3449]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.284000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3736376233336263316132626466613065346132363438633635663731 Jan 15 02:01:13.300249 containerd[1710]: time="2026-01-15T02:01:13.298770421Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-klwkl,Uid:fe38116a-5a64-40d4-9a99-99055821c317,Namespace:calico-system,Attempt:0,} returns sandbox id \"767b33bc1a2bdfa0e4a2648c65f7143f37c973fd663de3ea14aa87302928e144\"" Jan 15 02:01:13.330688 kubelet[2935]: E0115 02:01:13.330652 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.330758 kubelet[2935]: W0115 
02:01:13.330749 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.330836 kubelet[2935]: E0115 02:01:13.330810 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.331067 kubelet[2935]: E0115 02:01:13.331057 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.331120 kubelet[2935]: W0115 02:01:13.331107 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.331223 kubelet[2935]: E0115 02:01:13.331146 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.331445 kubelet[2935]: E0115 02:01:13.331437 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.331553 kubelet[2935]: W0115 02:01:13.331479 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.331632 kubelet[2935]: E0115 02:01:13.331624 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.331802 kubelet[2935]: E0115 02:01:13.331682 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.331802 kubelet[2935]: W0115 02:01:13.331713 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.331802 kubelet[2935]: E0115 02:01:13.331721 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.331930 kubelet[2935]: E0115 02:01:13.331923 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.331985 kubelet[2935]: W0115 02:01:13.331969 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.332034 kubelet[2935]: E0115 02:01:13.332014 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:13.332279 kubelet[2935]: E0115 02:01:13.332262 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.332279 kubelet[2935]: W0115 02:01:13.332270 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.332400 kubelet[2935]: E0115 02:01:13.332351 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.332574 kubelet[2935]: E0115 02:01:13.332558 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.332574 kubelet[2935]: W0115 02:01:13.332565 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.332679 kubelet[2935]: E0115 02:01:13.332610 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.332862 kubelet[2935]: E0115 02:01:13.332846 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.332862 kubelet[2935]: W0115 02:01:13.332852 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.332988 kubelet[2935]: E0115 02:01:13.332903 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.333177 kubelet[2935]: E0115 02:01:13.333168 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.333220 kubelet[2935]: W0115 02:01:13.333214 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.333324 kubelet[2935]: E0115 02:01:13.333287 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.333819 kubelet[2935]: E0115 02:01:13.333512 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.333963 kubelet[2935]: W0115 02:01:13.333881 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.334026 kubelet[2935]: E0115 02:01:13.334011 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:13.334194 kubelet[2935]: E0115 02:01:13.334187 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.334257 kubelet[2935]: W0115 02:01:13.334238 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.334326 kubelet[2935]: E0115 02:01:13.334313 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.334535 kubelet[2935]: E0115 02:01:13.334519 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.334535 kubelet[2935]: W0115 02:01:13.334526 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.334664 kubelet[2935]: E0115 02:01:13.334628 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.334833 kubelet[2935]: E0115 02:01:13.334826 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.334902 kubelet[2935]: W0115 02:01:13.334853 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.335000 kubelet[2935]: E0115 02:01:13.334948 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.335089 kubelet[2935]: E0115 02:01:13.335083 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.335138 kubelet[2935]: W0115 02:01:13.335119 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.335257 kubelet[2935]: E0115 02:01:13.335250 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.335501 kubelet[2935]: E0115 02:01:13.335443 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.335573 kubelet[2935]: W0115 02:01:13.335553 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.335665 kubelet[2935]: E0115 02:01:13.335565 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:13.335894 kubelet[2935]: E0115 02:01:13.335879 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.335966 kubelet[2935]: W0115 02:01:13.335929 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.336005 kubelet[2935]: E0115 02:01:13.335999 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.337008 kubelet[2935]: E0115 02:01:13.336987 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.337094 kubelet[2935]: W0115 02:01:13.337007 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.337094 kubelet[2935]: E0115 02:01:13.337059 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.337215 kubelet[2935]: E0115 02:01:13.337203 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.337215 kubelet[2935]: W0115 02:01:13.337213 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.337294 kubelet[2935]: E0115 02:01:13.337281 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.337429 kubelet[2935]: E0115 02:01:13.337418 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.337429 kubelet[2935]: W0115 02:01:13.337428 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.337519 kubelet[2935]: E0115 02:01:13.337468 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.337622 kubelet[2935]: E0115 02:01:13.337605 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.337622 kubelet[2935]: W0115 02:01:13.337617 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.337697 kubelet[2935]: E0115 02:01:13.337640 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:13.337809 kubelet[2935]: E0115 02:01:13.337753 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.337809 kubelet[2935]: W0115 02:01:13.337759 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.337809 kubelet[2935]: E0115 02:01:13.337776 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.338060 kubelet[2935]: E0115 02:01:13.337907 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.338060 kubelet[2935]: W0115 02:01:13.337913 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.338060 kubelet[2935]: E0115 02:01:13.337924 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.338122 kubelet[2935]: E0115 02:01:13.338067 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.338122 kubelet[2935]: W0115 02:01:13.338073 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.338122 kubelet[2935]: E0115 02:01:13.338079 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.338327 kubelet[2935]: E0115 02:01:13.338259 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.338327 kubelet[2935]: W0115 02:01:13.338265 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.338327 kubelet[2935]: E0115 02:01:13.338271 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.338498 kubelet[2935]: E0115 02:01:13.338435 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.338498 kubelet[2935]: W0115 02:01:13.338441 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.338498 kubelet[2935]: E0115 02:01:13.338447 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:13.342168 kubelet[2935]: E0115 02:01:13.342110 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:13.342168 kubelet[2935]: W0115 02:01:13.342121 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:13.342168 kubelet[2935]: E0115 02:01:13.342132 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:13.793451 kernel: kauditd_printk_skb: 69 callbacks suppressed Jan 15 02:01:13.793637 kernel: audit: type=1325 audit(1768442473.789:557): table=filter:115 family=2 entries=22 op=nft_register_rule pid=3505 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:13.789000 audit[3505]: NETFILTER_CFG table=filter:115 family=2 entries=22 op=nft_register_rule pid=3505 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:13.789000 audit[3505]: SYSCALL arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffdb4d9c190 a2=0 a3=7ffdb4d9c17c items=0 ppid=3037 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.807917 kernel: audit: type=1300 audit(1768442473.789:557): arch=c000003e syscall=46 success=yes exit=8224 a0=3 a1=7ffdb4d9c190 a2=0 a3=7ffdb4d9c17c items=0 ppid=3037 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.789000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:13.821232 kernel: audit: type=1327 audit(1768442473.789:557): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:13.822000 audit[3505]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3505 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:13.829203 kernel: audit: type=1325 audit(1768442473.822:558): table=nat:116 family=2 entries=12 op=nft_register_rule pid=3505 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:13.822000 audit[3505]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffdb4d9c190 a2=0 a3=0 items=0 ppid=3037 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.839196 kernel: audit: type=1300 audit(1768442473.822:558): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffdb4d9c190 a2=0 a3=0 items=0 ppid=3037 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:13.822000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:13.845199 kernel: 
audit: type=1327 audit(1768442473.822:558): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:14.738774 kubelet[2935]: E0115 02:01:14.736796 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:01:14.855045 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2527384742.mount: Deactivated successfully. Jan 15 02:01:15.872884 containerd[1710]: time="2026-01-15T02:01:15.872831981Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:15.874291 containerd[1710]: time="2026-01-15T02:01:15.874260422Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=33735893" Jan 15 02:01:15.875782 containerd[1710]: time="2026-01-15T02:01:15.875740723Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:15.878890 containerd[1710]: time="2026-01-15T02:01:15.878842336Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:15.879565 containerd[1710]: time="2026-01-15T02:01:15.879517401Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.636287476s" Jan 15 02:01:15.879565 containerd[1710]: time="2026-01-15T02:01:15.879542876Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Jan 15 02:01:15.881167 containerd[1710]: time="2026-01-15T02:01:15.880916868Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 15 02:01:15.896000 containerd[1710]: time="2026-01-15T02:01:15.895969365Z" level=info msg="CreateContainer within sandbox \"c83e1bd24bcc10d3ff918dc1b05da1ce43d38f0ef1e94e6ee457537e6ef7ac17\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 15 02:01:15.912597 containerd[1710]: time="2026-01-15T02:01:15.912570932Z" level=info msg="Container 55de025c600f84f01cd81587e7dc1acfe15c3af78b568aa53ea8719a92d80505: CDI devices from CRI Config.CDIDevices: []" Jan 15 02:01:15.925352 containerd[1710]: time="2026-01-15T02:01:15.925288895Z" level=info msg="CreateContainer within sandbox \"c83e1bd24bcc10d3ff918dc1b05da1ce43d38f0ef1e94e6ee457537e6ef7ac17\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"55de025c600f84f01cd81587e7dc1acfe15c3af78b568aa53ea8719a92d80505\"" Jan 15 02:01:15.925662 containerd[1710]: time="2026-01-15T02:01:15.925642473Z" level=info msg="StartContainer for \"55de025c600f84f01cd81587e7dc1acfe15c3af78b568aa53ea8719a92d80505\"" Jan 15 02:01:15.927390 containerd[1710]: time="2026-01-15T02:01:15.927327534Z" level=info 
msg="connecting to shim 55de025c600f84f01cd81587e7dc1acfe15c3af78b568aa53ea8719a92d80505" address="unix:///run/containerd/s/89363750e11c1ae43c0acf87e48566f7cf194a4123e99cc64bd2ff6edea4215a" protocol=ttrpc version=3 Jan 15 02:01:15.947330 systemd[1]: Started cri-containerd-55de025c600f84f01cd81587e7dc1acfe15c3af78b568aa53ea8719a92d80505.scope - libcontainer container 55de025c600f84f01cd81587e7dc1acfe15c3af78b568aa53ea8719a92d80505. Jan 15 02:01:15.957000 audit: BPF prog-id=161 op=LOAD Jan 15 02:01:15.960166 kernel: audit: type=1334 audit(1768442475.957:559): prog-id=161 op=LOAD Jan 15 02:01:15.960000 audit: BPF prog-id=162 op=LOAD Jan 15 02:01:15.963187 kernel: audit: type=1334 audit(1768442475.960:560): prog-id=162 op=LOAD Jan 15 02:01:15.960000 audit[3516]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=3356 pid=3516 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:15.960000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535646530323563363030663834663031636438313538376537646331 Jan 15 02:01:15.969949 kernel: audit: type=1300 audit(1768442475.960:560): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0238 a2=98 a3=0 items=0 ppid=3356 pid=3516 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:15.970005 kernel: audit: type=1327 audit(1768442475.960:560): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535646530323563363030663834663031636438313538376537646331 Jan 15 02:01:15.962000 audit: BPF prog-id=162 op=UNLOAD Jan 15 02:01:15.962000 audit[3516]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3356 pid=3516 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:15.962000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535646530323563363030663834663031636438313538376537646331 Jan 15 02:01:15.962000 audit: BPF prog-id=163 op=LOAD Jan 15 02:01:15.962000 audit[3516]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a0488 a2=98 a3=0 items=0 ppid=3356 pid=3516 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:15.962000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535646530323563363030663834663031636438313538376537646331 Jan 15 02:01:15.962000 audit: BPF prog-id=164 op=LOAD Jan 15 02:01:15.962000 audit[3516]: SYSCALL arch=c000003e syscall=321 success=yes 
exit=23 a0=5 a1=c0001a0218 a2=98 a3=0 items=0 ppid=3356 pid=3516 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:15.962000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535646530323563363030663834663031636438313538376537646331 Jan 15 02:01:15.962000 audit: BPF prog-id=164 op=UNLOAD Jan 15 02:01:15.962000 audit[3516]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3356 pid=3516 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:15.962000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535646530323563363030663834663031636438313538376537646331 Jan 15 02:01:15.962000 audit: BPF prog-id=163 op=UNLOAD Jan 15 02:01:15.962000 audit[3516]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3356 pid=3516 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:15.962000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535646530323563363030663834663031636438313538376537646331 Jan 15 02:01:15.962000 audit: BPF prog-id=165 op=LOAD Jan 15 02:01:15.962000 audit[3516]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001a06e8 a2=98 a3=0 items=0 ppid=3356 pid=3516 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:15.962000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535646530323563363030663834663031636438313538376537646331 Jan 15 02:01:16.009282 containerd[1710]: time="2026-01-15T02:01:16.009241294Z" level=info msg="StartContainer for \"55de025c600f84f01cd81587e7dc1acfe15c3af78b568aa53ea8719a92d80505\" returns successfully" Jan 15 02:01:16.737841 kubelet[2935]: E0115 02:01:16.737679 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:01:16.896479 kubelet[2935]: I0115 02:01:16.895561 2935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-c5b6dfdcd-dqxw4" podStartSLOduration=2.258156028 podStartE2EDuration="4.895543664s" podCreationTimestamp="2026-01-15 02:01:12 +0000 UTC" firstStartedPulling="2026-01-15 02:01:13.242909453 +0000 UTC m=+29.620901273" 
lastFinishedPulling="2026-01-15 02:01:15.880297085 +0000 UTC m=+32.258288909" observedRunningTime="2026-01-15 02:01:16.894370804 +0000 UTC m=+33.272362733" watchObservedRunningTime="2026-01-15 02:01:16.895543664 +0000 UTC m=+33.273535541" Jan 15 02:01:16.942101 kubelet[2935]: E0115 02:01:16.942046 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.942235 kubelet[2935]: W0115 02:01:16.942103 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.942235 kubelet[2935]: E0115 02:01:16.942135 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.942559 kubelet[2935]: E0115 02:01:16.942533 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.942559 kubelet[2935]: W0115 02:01:16.942555 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.942635 kubelet[2935]: E0115 02:01:16.942574 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.942894 kubelet[2935]: E0115 02:01:16.942847 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.942894 kubelet[2935]: W0115 02:01:16.942890 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.942974 kubelet[2935]: E0115 02:01:16.942908 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.943420 kubelet[2935]: E0115 02:01:16.943401 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.943466 kubelet[2935]: W0115 02:01:16.943422 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.943498 kubelet[2935]: E0115 02:01:16.943465 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:16.943832 kubelet[2935]: E0115 02:01:16.943814 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.943874 kubelet[2935]: W0115 02:01:16.943834 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.943911 kubelet[2935]: E0115 02:01:16.943851 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.944210 kubelet[2935]: E0115 02:01:16.944191 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.944252 kubelet[2935]: W0115 02:01:16.944212 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.944291 kubelet[2935]: E0115 02:01:16.944260 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.944652 kubelet[2935]: E0115 02:01:16.944631 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.944688 kubelet[2935]: W0115 02:01:16.944657 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.944688 kubelet[2935]: E0115 02:01:16.944675 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.944993 kubelet[2935]: E0115 02:01:16.944954 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.945029 kubelet[2935]: W0115 02:01:16.944995 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.945029 kubelet[2935]: E0115 02:01:16.945011 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.945393 kubelet[2935]: E0115 02:01:16.945376 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.945430 kubelet[2935]: W0115 02:01:16.945396 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.945430 kubelet[2935]: E0115 02:01:16.945411 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:16.945717 kubelet[2935]: E0115 02:01:16.945700 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.945757 kubelet[2935]: W0115 02:01:16.945719 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.945757 kubelet[2935]: E0115 02:01:16.945735 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.946019 kubelet[2935]: E0115 02:01:16.946002 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.946054 kubelet[2935]: W0115 02:01:16.946022 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.946054 kubelet[2935]: E0115 02:01:16.946038 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.946355 kubelet[2935]: E0115 02:01:16.946338 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.946391 kubelet[2935]: W0115 02:01:16.946357 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.946391 kubelet[2935]: E0115 02:01:16.946374 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.946694 kubelet[2935]: E0115 02:01:16.946677 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.946732 kubelet[2935]: W0115 02:01:16.946697 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.946732 kubelet[2935]: E0115 02:01:16.946713 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.946993 kubelet[2935]: E0115 02:01:16.946976 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.947033 kubelet[2935]: W0115 02:01:16.946996 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.947033 kubelet[2935]: E0115 02:01:16.947011 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:16.947334 kubelet[2935]: E0115 02:01:16.947316 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.947381 kubelet[2935]: W0115 02:01:16.947336 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.947381 kubelet[2935]: E0115 02:01:16.947351 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.958247 kubelet[2935]: E0115 02:01:16.958215 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.958247 kubelet[2935]: W0115 02:01:16.958243 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.958362 kubelet[2935]: E0115 02:01:16.958264 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.958767 kubelet[2935]: E0115 02:01:16.958690 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.958767 kubelet[2935]: W0115 02:01:16.958714 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.958980 kubelet[2935]: E0115 02:01:16.958775 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.959354 kubelet[2935]: E0115 02:01:16.959329 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.959534 kubelet[2935]: W0115 02:01:16.959510 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.959704 kubelet[2935]: E0115 02:01:16.959567 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.959971 kubelet[2935]: E0115 02:01:16.959935 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.959971 kubelet[2935]: W0115 02:01:16.959962 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.960073 kubelet[2935]: E0115 02:01:16.959983 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:16.962199 kubelet[2935]: E0115 02:01:16.962142 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.962199 kubelet[2935]: W0115 02:01:16.962199 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.962314 kubelet[2935]: E0115 02:01:16.962224 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.963941 kubelet[2935]: E0115 02:01:16.963910 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.964004 kubelet[2935]: W0115 02:01:16.963941 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.964004 kubelet[2935]: E0115 02:01:16.963963 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.964415 kubelet[2935]: E0115 02:01:16.964391 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.964468 kubelet[2935]: W0115 02:01:16.964416 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.964468 kubelet[2935]: E0115 02:01:16.964436 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.967093 kubelet[2935]: E0115 02:01:16.966962 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.967093 kubelet[2935]: W0115 02:01:16.966979 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.967093 kubelet[2935]: E0115 02:01:16.966994 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.967479 kubelet[2935]: E0115 02:01:16.967433 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.967479 kubelet[2935]: W0115 02:01:16.967446 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.967638 kubelet[2935]: E0115 02:01:16.967458 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:16.967959 kubelet[2935]: E0115 02:01:16.967915 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.967959 kubelet[2935]: W0115 02:01:16.967927 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.968147 kubelet[2935]: E0115 02:01:16.967939 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.969315 kubelet[2935]: E0115 02:01:16.969244 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.969763 kubelet[2935]: W0115 02:01:16.969666 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.970822 kubelet[2935]: E0115 02:01:16.970623 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.971080 kubelet[2935]: E0115 02:01:16.970964 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.971360 kubelet[2935]: W0115 02:01:16.970979 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.972433 kubelet[2935]: E0115 02:01:16.971945 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.973088 kubelet[2935]: E0115 02:01:16.972996 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.973192 kubelet[2935]: W0115 02:01:16.973176 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.974347 kubelet[2935]: E0115 02:01:16.974330 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.974468 kubelet[2935]: W0115 02:01:16.974455 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.974534 kubelet[2935]: E0115 02:01:16.974522 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.974759 kubelet[2935]: E0115 02:01:16.974733 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:16.975165 kubelet[2935]: E0115 02:01:16.975094 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.975165 kubelet[2935]: W0115 02:01:16.975106 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.975165 kubelet[2935]: E0115 02:01:16.975119 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.976491 kubelet[2935]: E0115 02:01:16.976382 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.976491 kubelet[2935]: W0115 02:01:16.976397 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.976491 kubelet[2935]: E0115 02:01:16.976411 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.976817 kubelet[2935]: E0115 02:01:16.976807 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.977133 kubelet[2935]: W0115 02:01:16.976877 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.977133 kubelet[2935]: E0115 02:01:16.976892 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 15 02:01:16.977407 kubelet[2935]: E0115 02:01:16.977396 2935 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 15 02:01:16.977465 kubelet[2935]: W0115 02:01:16.977456 2935 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 15 02:01:16.978207 kubelet[2935]: E0115 02:01:16.978188 2935 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 15 02:01:17.575840 containerd[1710]: time="2026-01-15T02:01:17.575333131Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:17.577534 containerd[1710]: time="2026-01-15T02:01:17.577516493Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:17.579398 containerd[1710]: time="2026-01-15T02:01:17.579384203Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:17.583237 containerd[1710]: time="2026-01-15T02:01:17.583207841Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:17.583818 containerd[1710]: time="2026-01-15T02:01:17.583794464Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.702850849s" Jan 15 02:01:17.583909 containerd[1710]: time="2026-01-15T02:01:17.583895034Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Jan 15 02:01:17.585483 containerd[1710]: time="2026-01-15T02:01:17.585457733Z" level=info msg="CreateContainer within sandbox \"767b33bc1a2bdfa0e4a2648c65f7143f37c973fd663de3ea14aa87302928e144\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 15 02:01:17.601880 containerd[1710]: time="2026-01-15T02:01:17.601857995Z" level=info msg="Container e209b8a9d214d3285c01203fd1cc749ebdf7aa79031ab0be28967a27b9c7c4f7: CDI devices from CRI Config.CDIDevices: []" Jan 15 02:01:17.608483 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2077738547.mount: Deactivated successfully. Jan 15 02:01:17.621841 containerd[1710]: time="2026-01-15T02:01:17.621805115Z" level=info msg="CreateContainer within sandbox \"767b33bc1a2bdfa0e4a2648c65f7143f37c973fd663de3ea14aa87302928e144\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e209b8a9d214d3285c01203fd1cc749ebdf7aa79031ab0be28967a27b9c7c4f7\"" Jan 15 02:01:17.622474 containerd[1710]: time="2026-01-15T02:01:17.622434060Z" level=info msg="StartContainer for \"e209b8a9d214d3285c01203fd1cc749ebdf7aa79031ab0be28967a27b9c7c4f7\"" Jan 15 02:01:17.624448 containerd[1710]: time="2026-01-15T02:01:17.624421710Z" level=info msg="connecting to shim e209b8a9d214d3285c01203fd1cc749ebdf7aa79031ab0be28967a27b9c7c4f7" address="unix:///run/containerd/s/bbfaede1380bd71675d129843fde107fac543ba8bb5b335042a48815bd8a3707" protocol=ttrpc version=3 Jan 15 02:01:17.652456 systemd[1]: Started cri-containerd-e209b8a9d214d3285c01203fd1cc749ebdf7aa79031ab0be28967a27b9c7c4f7.scope - libcontainer container e209b8a9d214d3285c01203fd1cc749ebdf7aa79031ab0be28967a27b9c7c4f7. 
Jan 15 02:01:17.698000 audit: BPF prog-id=166 op=LOAD Jan 15 02:01:17.698000 audit[3592]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000138488 a2=98 a3=0 items=0 ppid=3437 pid=3592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:17.698000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532303962386139643231346433323835633031323033666431636337 Jan 15 02:01:17.698000 audit: BPF prog-id=167 op=LOAD Jan 15 02:01:17.698000 audit[3592]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000138218 a2=98 a3=0 items=0 ppid=3437 pid=3592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:17.698000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532303962386139643231346433323835633031323033666431636337 Jan 15 02:01:17.698000 audit: BPF prog-id=167 op=UNLOAD Jan 15 02:01:17.698000 audit[3592]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:17.698000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532303962386139643231346433323835633031323033666431636337 Jan 15 02:01:17.698000 audit: BPF prog-id=166 op=UNLOAD Jan 15 02:01:17.698000 audit[3592]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:17.698000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532303962386139643231346433323835633031323033666431636337 Jan 15 02:01:17.698000 audit: BPF prog-id=168 op=LOAD Jan 15 02:01:17.698000 audit[3592]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001386e8 a2=98 a3=0 items=0 ppid=3437 pid=3592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:17.698000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6532303962386139643231346433323835633031323033666431636337 Jan 15 02:01:17.729140 containerd[1710]: time="2026-01-15T02:01:17.729070079Z" level=info msg="StartContainer for 
\"e209b8a9d214d3285c01203fd1cc749ebdf7aa79031ab0be28967a27b9c7c4f7\" returns successfully" Jan 15 02:01:17.741723 systemd[1]: cri-containerd-e209b8a9d214d3285c01203fd1cc749ebdf7aa79031ab0be28967a27b9c7c4f7.scope: Deactivated successfully. Jan 15 02:01:17.745077 containerd[1710]: time="2026-01-15T02:01:17.745047140Z" level=info msg="received container exit event container_id:\"e209b8a9d214d3285c01203fd1cc749ebdf7aa79031ab0be28967a27b9c7c4f7\" id:\"e209b8a9d214d3285c01203fd1cc749ebdf7aa79031ab0be28967a27b9c7c4f7\" pid:3605 exited_at:{seconds:1768442477 nanos:744659736}" Jan 15 02:01:17.744000 audit: BPF prog-id=168 op=UNLOAD Jan 15 02:01:17.769042 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e209b8a9d214d3285c01203fd1cc749ebdf7aa79031ab0be28967a27b9c7c4f7-rootfs.mount: Deactivated successfully. Jan 15 02:01:18.167334 kubelet[2935]: I0115 02:01:17.882670 2935 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 15 02:01:18.737293 kubelet[2935]: E0115 02:01:18.737200 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:01:19.897391 containerd[1710]: time="2026-01-15T02:01:19.897287883Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 15 02:01:20.511965 kubelet[2935]: I0115 02:01:20.511453 2935 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 15 02:01:20.581000 audit[3644]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=3644 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:20.584191 kernel: kauditd_printk_skb: 34 callbacks suppressed Jan 15 02:01:20.584281 kernel: audit: type=1325 audit(1768442480.581:573): table=filter:117 family=2 entries=21 op=nft_register_rule pid=3644 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:20.581000 audit[3644]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fffeb316440 a2=0 a3=7fffeb31642c items=0 ppid=3037 pid=3644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:20.581000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:20.609788 kernel: audit: type=1300 audit(1768442480.581:573): arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7fffeb316440 a2=0 a3=7fffeb31642c items=0 ppid=3037 pid=3644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:20.609875 kernel: audit: type=1327 audit(1768442480.581:573): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:20.593000 audit[3644]: NETFILTER_CFG table=nat:118 family=2 entries=19 op=nft_register_chain pid=3644 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:20.593000 audit[3644]: SYSCALL arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7fffeb316440 a2=0 a3=7fffeb31642c items=0 ppid=3037 pid=3644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:20.622566 kernel: audit: type=1325 audit(1768442480.593:574): table=nat:118 family=2 entries=19 op=nft_register_chain pid=3644 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:20.622650 kernel: audit: type=1300 audit(1768442480.593:574): arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7fffeb316440 a2=0 a3=7fffeb31642c items=0 ppid=3037 pid=3644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:20.593000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:20.629539 kernel: audit: type=1327 audit(1768442480.593:574): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:20.738112 kubelet[2935]: E0115 02:01:20.737520 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:01:22.737178 kubelet[2935]: E0115 02:01:22.736893 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:01:23.900749 containerd[1710]: time="2026-01-15T02:01:23.900655738Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:23.902301 containerd[1710]: time="2026-01-15T02:01:23.902283363Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70442291" Jan 15 02:01:23.904304 containerd[1710]: time="2026-01-15T02:01:23.904282893Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:23.906385 containerd[1710]: time="2026-01-15T02:01:23.906346349Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:23.906755 containerd[1710]: time="2026-01-15T02:01:23.906690560Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 4.009341937s" Jan 15 02:01:23.906755 containerd[1710]: time="2026-01-15T02:01:23.906712332Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Jan 15 02:01:23.910181 containerd[1710]: 
time="2026-01-15T02:01:23.909562319Z" level=info msg="CreateContainer within sandbox \"767b33bc1a2bdfa0e4a2648c65f7143f37c973fd663de3ea14aa87302928e144\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 15 02:01:23.925331 containerd[1710]: time="2026-01-15T02:01:23.925311947Z" level=info msg="Container d683b6aaaa94b974cba6ef8bd9da9b18dc863d5b4f258f6a7b50defc0a682971: CDI devices from CRI Config.CDIDevices: []" Jan 15 02:01:23.939691 containerd[1710]: time="2026-01-15T02:01:23.939665409Z" level=info msg="CreateContainer within sandbox \"767b33bc1a2bdfa0e4a2648c65f7143f37c973fd663de3ea14aa87302928e144\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"d683b6aaaa94b974cba6ef8bd9da9b18dc863d5b4f258f6a7b50defc0a682971\"" Jan 15 02:01:23.940052 containerd[1710]: time="2026-01-15T02:01:23.940036248Z" level=info msg="StartContainer for \"d683b6aaaa94b974cba6ef8bd9da9b18dc863d5b4f258f6a7b50defc0a682971\"" Jan 15 02:01:23.942332 containerd[1710]: time="2026-01-15T02:01:23.942309742Z" level=info msg="connecting to shim d683b6aaaa94b974cba6ef8bd9da9b18dc863d5b4f258f6a7b50defc0a682971" address="unix:///run/containerd/s/bbfaede1380bd71675d129843fde107fac543ba8bb5b335042a48815bd8a3707" protocol=ttrpc version=3 Jan 15 02:01:23.960336 systemd[1]: Started cri-containerd-d683b6aaaa94b974cba6ef8bd9da9b18dc863d5b4f258f6a7b50defc0a682971.scope - libcontainer container d683b6aaaa94b974cba6ef8bd9da9b18dc863d5b4f258f6a7b50defc0a682971. Jan 15 02:01:24.009000 audit: BPF prog-id=169 op=LOAD Jan 15 02:01:24.009000 audit[3656]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=3437 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:24.014292 kernel: audit: type=1334 audit(1768442484.009:575): prog-id=169 op=LOAD Jan 15 02:01:24.014333 kernel: audit: type=1300 audit(1768442484.009:575): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=3437 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:24.009000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436383362366161616139346239373463626136656638626439646139 Jan 15 02:01:24.018818 kernel: audit: type=1327 audit(1768442484.009:575): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436383362366161616139346239373463626136656638626439646139 Jan 15 02:01:24.009000 audit: BPF prog-id=170 op=LOAD Jan 15 02:01:24.021996 kernel: audit: type=1334 audit(1768442484.009:576): prog-id=170 op=LOAD Jan 15 02:01:24.009000 audit[3656]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=3437 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:24.009000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436383362366161616139346239373463626136656638626439646139 Jan 15 02:01:24.009000 audit: BPF prog-id=170 op=UNLOAD Jan 15 02:01:24.009000 audit[3656]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:24.009000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436383362366161616139346239373463626136656638626439646139 Jan 15 02:01:24.009000 audit: BPF prog-id=169 op=UNLOAD Jan 15 02:01:24.009000 audit[3656]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:24.009000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436383362366161616139346239373463626136656638626439646139 Jan 15 02:01:24.009000 audit: BPF prog-id=171 op=LOAD Jan 15 02:01:24.009000 audit[3656]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=3437 pid=3656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:24.009000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6436383362366161616139346239373463626136656638626439646139 Jan 15 02:01:24.041817 containerd[1710]: time="2026-01-15T02:01:24.041629663Z" level=info msg="StartContainer for \"d683b6aaaa94b974cba6ef8bd9da9b18dc863d5b4f258f6a7b50defc0a682971\" returns successfully" Jan 15 02:01:24.738868 kubelet[2935]: E0115 02:01:24.737470 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:01:25.436420 containerd[1710]: time="2026-01-15T02:01:25.436340220Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 15 02:01:25.439875 systemd[1]: cri-containerd-d683b6aaaa94b974cba6ef8bd9da9b18dc863d5b4f258f6a7b50defc0a682971.scope: Deactivated successfully. 
Jan 15 02:01:25.441744 systemd[1]: cri-containerd-d683b6aaaa94b974cba6ef8bd9da9b18dc863d5b4f258f6a7b50defc0a682971.scope: Consumed 737ms CPU time, 197.1M memory peak, 171.3M written to disk. Jan 15 02:01:25.442826 containerd[1710]: time="2026-01-15T02:01:25.442703081Z" level=info msg="received container exit event container_id:\"d683b6aaaa94b974cba6ef8bd9da9b18dc863d5b4f258f6a7b50defc0a682971\" id:\"d683b6aaaa94b974cba6ef8bd9da9b18dc863d5b4f258f6a7b50defc0a682971\" pid:3668 exited_at:{seconds:1768442485 nanos:441971011}" Jan 15 02:01:25.443000 audit: BPF prog-id=171 op=UNLOAD Jan 15 02:01:25.475715 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d683b6aaaa94b974cba6ef8bd9da9b18dc863d5b4f258f6a7b50defc0a682971-rootfs.mount: Deactivated successfully. Jan 15 02:01:25.499407 kubelet[2935]: I0115 02:01:25.499343 2935 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 15 02:01:26.098136 kubelet[2935]: W0115 02:01:25.549746 2935 reflector.go:569] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4515-1-0-n-e5e35ee394" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4515-1-0-n-e5e35ee394' and this object Jan 15 02:01:26.098136 kubelet[2935]: E0115 02:01:25.549776 2935 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4515-1-0-n-e5e35ee394\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4515-1-0-n-e5e35ee394' and this object" logger="UnhandledError" Jan 15 02:01:26.098136 kubelet[2935]: W0115 02:01:25.549908 2935 reflector.go:569] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4515-1-0-n-e5e35ee394" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4515-1-0-n-e5e35ee394' and this object Jan 15 02:01:26.098136 kubelet[2935]: E0115 02:01:25.549922 2935 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ci-4515-1-0-n-e5e35ee394\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4515-1-0-n-e5e35ee394' and this object" logger="UnhandledError" Jan 15 02:01:25.548906 systemd[1]: Created slice kubepods-burstable-pod2ea7b673_cd9f_4a24_98cf_dc76e6ed54b4.slice - libcontainer container kubepods-burstable-pod2ea7b673_cd9f_4a24_98cf_dc76e6ed54b4.slice. 
Jan 15 02:01:26.099868 kubelet[2935]: W0115 02:01:25.548145 2935 reflector.go:569] object-"calico-system"/"whisker-backend-key-pair": failed to list *v1.Secret: secrets "whisker-backend-key-pair" is forbidden: User "system:node:ci-4515-1-0-n-e5e35ee394" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4515-1-0-n-e5e35ee394' and this object Jan 15 02:01:26.099868 kubelet[2935]: E0115 02:01:25.550230 2935 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"whisker-backend-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"whisker-backend-key-pair\" is forbidden: User \"system:node:ci-4515-1-0-n-e5e35ee394\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4515-1-0-n-e5e35ee394' and this object" logger="UnhandledError" Jan 15 02:01:26.099868 kubelet[2935]: W0115 02:01:25.551600 2935 reflector.go:569] object-"calico-system"/"whisker-ca-bundle": failed to list *v1.ConfigMap: configmaps "whisker-ca-bundle" is forbidden: User "system:node:ci-4515-1-0-n-e5e35ee394" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4515-1-0-n-e5e35ee394' and this object Jan 15 02:01:26.099868 kubelet[2935]: E0115 02:01:25.551622 2935 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"whisker-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"whisker-ca-bundle\" is forbidden: User \"system:node:ci-4515-1-0-n-e5e35ee394\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4515-1-0-n-e5e35ee394' and this object" logger="UnhandledError" Jan 15 02:01:25.558744 systemd[1]: Created slice kubepods-besteffort-pod05de6337_cbb1_44dd_97eb_3966ed3ddde0.slice - libcontainer container kubepods-besteffort-pod05de6337_cbb1_44dd_97eb_3966ed3ddde0.slice. 
Jan 15 02:01:26.100349 kubelet[2935]: I0115 02:01:25.625825 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85d7699e-6a0a-4fd1-96b0-9365b90d23ad-tigera-ca-bundle\") pod \"calico-kube-controllers-7b978c6cc9-jnxw2\" (UID: \"85d7699e-6a0a-4fd1-96b0-9365b90d23ad\") " pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" Jan 15 02:01:26.100349 kubelet[2935]: I0115 02:01:25.625849 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf5cf22e-5674-4890-9d47-13e5c96cd4be-config-volume\") pod \"coredns-668d6bf9bc-ljpnb\" (UID: \"bf5cf22e-5674-4890-9d47-13e5c96cd4be\") " pod="kube-system/coredns-668d6bf9bc-ljpnb" Jan 15 02:01:26.100349 kubelet[2935]: I0115 02:01:25.625866 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkwh7\" (UniqueName: \"kubernetes.io/projected/7529b612-ddf2-4fb7-9823-720a3bd71760-kube-api-access-hkwh7\") pod \"goldmane-666569f655-5x8dj\" (UID: \"7529b612-ddf2-4fb7-9823-720a3bd71760\") " pod="calico-system/goldmane-666569f655-5x8dj" Jan 15 02:01:26.100349 kubelet[2935]: I0115 02:01:25.625883 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/7529b612-ddf2-4fb7-9823-720a3bd71760-goldmane-key-pair\") pod \"goldmane-666569f655-5x8dj\" (UID: \"7529b612-ddf2-4fb7-9823-720a3bd71760\") " pod="calico-system/goldmane-666569f655-5x8dj" Jan 15 02:01:26.100349 kubelet[2935]: I0115 02:01:25.625898 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/768894fc-e95b-49e5-9a90-1487b94ce02a-calico-apiserver-certs\") pod \"calico-apiserver-5dcd9489f8-x7ppt\" (UID: \"768894fc-e95b-49e5-9a90-1487b94ce02a\") " pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" Jan 15 02:01:25.567521 systemd[1]: Created slice kubepods-besteffort-pod19301f0b_a1cd_4987_b493_85e61d59a457.slice - libcontainer container kubepods-besteffort-pod19301f0b_a1cd_4987_b493_85e61d59a457.slice. 
Jan 15 02:01:26.100936 kubelet[2935]: I0115 02:01:25.625912 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/19301f0b-a1cd-4987-b493-85e61d59a457-calico-apiserver-certs\") pod \"calico-apiserver-5dcd9489f8-9mlsx\" (UID: \"19301f0b-a1cd-4987-b493-85e61d59a457\") " pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" Jan 15 02:01:26.100936 kubelet[2935]: I0115 02:01:25.625926 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7529b612-ddf2-4fb7-9823-720a3bd71760-goldmane-ca-bundle\") pod \"goldmane-666569f655-5x8dj\" (UID: \"7529b612-ddf2-4fb7-9823-720a3bd71760\") " pod="calico-system/goldmane-666569f655-5x8dj" Jan 15 02:01:26.100936 kubelet[2935]: I0115 02:01:25.625942 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs4sj\" (UniqueName: \"kubernetes.io/projected/2ea7b673-cd9f-4a24-98cf-dc76e6ed54b4-kube-api-access-xs4sj\") pod \"coredns-668d6bf9bc-hpxzl\" (UID: \"2ea7b673-cd9f-4a24-98cf-dc76e6ed54b4\") " pod="kube-system/coredns-668d6bf9bc-hpxzl" Jan 15 02:01:26.100936 kubelet[2935]: I0115 02:01:25.625957 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vczts\" (UniqueName: \"kubernetes.io/projected/05de6337-cbb1-44dd-97eb-3966ed3ddde0-kube-api-access-vczts\") pod \"whisker-c87d4cc77-45zsr\" (UID: \"05de6337-cbb1-44dd-97eb-3966ed3ddde0\") " pod="calico-system/whisker-c87d4cc77-45zsr" Jan 15 02:01:26.100936 kubelet[2935]: I0115 02:01:25.625970 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzplf\" (UniqueName: \"kubernetes.io/projected/768894fc-e95b-49e5-9a90-1487b94ce02a-kube-api-access-hzplf\") pod \"calico-apiserver-5dcd9489f8-x7ppt\" (UID: \"768894fc-e95b-49e5-9a90-1487b94ce02a\") " pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" Jan 15 02:01:25.574008 systemd[1]: Created slice kubepods-besteffort-pod85d7699e_6a0a_4fd1_96b0_9365b90d23ad.slice - libcontainer container kubepods-besteffort-pod85d7699e_6a0a_4fd1_96b0_9365b90d23ad.slice. 
Jan 15 02:01:26.101486 kubelet[2935]: I0115 02:01:25.625985 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xk5t\" (UniqueName: \"kubernetes.io/projected/85d7699e-6a0a-4fd1-96b0-9365b90d23ad-kube-api-access-2xk5t\") pod \"calico-kube-controllers-7b978c6cc9-jnxw2\" (UID: \"85d7699e-6a0a-4fd1-96b0-9365b90d23ad\") " pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" Jan 15 02:01:26.101486 kubelet[2935]: I0115 02:01:25.626002 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzlsr\" (UniqueName: \"kubernetes.io/projected/19301f0b-a1cd-4987-b493-85e61d59a457-kube-api-access-nzlsr\") pod \"calico-apiserver-5dcd9489f8-9mlsx\" (UID: \"19301f0b-a1cd-4987-b493-85e61d59a457\") " pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" Jan 15 02:01:26.101486 kubelet[2935]: I0115 02:01:25.626139 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg46z\" (UniqueName: \"kubernetes.io/projected/bf5cf22e-5674-4890-9d47-13e5c96cd4be-kube-api-access-fg46z\") pod \"coredns-668d6bf9bc-ljpnb\" (UID: \"bf5cf22e-5674-4890-9d47-13e5c96cd4be\") " pod="kube-system/coredns-668d6bf9bc-ljpnb" Jan 15 02:01:26.101486 kubelet[2935]: I0115 02:01:25.626179 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7529b612-ddf2-4fb7-9823-720a3bd71760-config\") pod \"goldmane-666569f655-5x8dj\" (UID: \"7529b612-ddf2-4fb7-9823-720a3bd71760\") " pod="calico-system/goldmane-666569f655-5x8dj" Jan 15 02:01:26.101486 kubelet[2935]: I0115 02:01:25.626195 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/05de6337-cbb1-44dd-97eb-3966ed3ddde0-whisker-backend-key-pair\") pod \"whisker-c87d4cc77-45zsr\" (UID: \"05de6337-cbb1-44dd-97eb-3966ed3ddde0\") " pod="calico-system/whisker-c87d4cc77-45zsr" Jan 15 02:01:25.582240 systemd[1]: Created slice kubepods-besteffort-pod7529b612_ddf2_4fb7_9823_720a3bd71760.slice - libcontainer container kubepods-besteffort-pod7529b612_ddf2_4fb7_9823_720a3bd71760.slice. Jan 15 02:01:26.102040 kubelet[2935]: I0115 02:01:25.626212 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ea7b673-cd9f-4a24-98cf-dc76e6ed54b4-config-volume\") pod \"coredns-668d6bf9bc-hpxzl\" (UID: \"2ea7b673-cd9f-4a24-98cf-dc76e6ed54b4\") " pod="kube-system/coredns-668d6bf9bc-hpxzl" Jan 15 02:01:26.102040 kubelet[2935]: I0115 02:01:25.626226 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05de6337-cbb1-44dd-97eb-3966ed3ddde0-whisker-ca-bundle\") pod \"whisker-c87d4cc77-45zsr\" (UID: \"05de6337-cbb1-44dd-97eb-3966ed3ddde0\") " pod="calico-system/whisker-c87d4cc77-45zsr" Jan 15 02:01:25.588388 systemd[1]: Created slice kubepods-burstable-podbf5cf22e_5674_4890_9d47_13e5c96cd4be.slice - libcontainer container kubepods-burstable-podbf5cf22e_5674_4890_9d47_13e5c96cd4be.slice. Jan 15 02:01:25.596485 systemd[1]: Created slice kubepods-besteffort-pod768894fc_e95b_49e5_9a90_1487b94ce02a.slice - libcontainer container kubepods-besteffort-pod768894fc_e95b_49e5_9a90_1487b94ce02a.slice. 
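Note on the reconciler_common entries above: they only register each declared volume with the kubelet volume manager's desired state of world. The actual MountVolume.SetUp calls follow, and, as the next entries show, a failure is retried after a growing durationBeforeRetry that starts at 500ms. The loop below illustrates that retry shape only; the doubling factor and the cap are assumptions rather than values taken from this log, and mountSecret is a hypothetical stand-in for the real SetUp call.

```go
// Illustrative retry loop only; mountSecret is a hypothetical stand-in for
// MountVolume.SetUp, and the x2 growth plus the 2-minute cap are assumptions.
package main

import (
	"errors"
	"fmt"
	"os"
	"time"
)

// Fails until a sentinel file exists, mimicking "failed to sync secret cache:
// timed out waiting for the condition" from the log.
func mountSecret() error {
	if _, err := os.Stat("/tmp/secret-cache-synced"); err != nil {
		return errors.New("failed to sync secret cache: timed out waiting for the condition")
	}
	return nil
}

func main() {
	backoff := 500 * time.Millisecond // first retry interval, as logged above
	for attempt := 1; attempt <= 8; attempt++ {
		if err := mountSecret(); err == nil {
			fmt.Println("mounted on attempt", attempt)
			return
		}
		fmt.Printf("attempt %d failed; no retries permitted for %v\n", attempt, backoff)
		time.Sleep(backoff)
		backoff *= 2
		if backoff > 2*time.Minute { // assumed cap for the sketch
			backoff = 2 * time.Minute
		}
	}
	fmt.Println("giving up (sketch only)")
}
```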
Jan 15 02:01:26.399495 containerd[1710]: time="2026-01-15T02:01:26.398629988Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hpxzl,Uid:2ea7b673-cd9f-4a24-98cf-dc76e6ed54b4,Namespace:kube-system,Attempt:0,}" Jan 15 02:01:26.411191 containerd[1710]: time="2026-01-15T02:01:26.411110484Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-5x8dj,Uid:7529b612-ddf2-4fb7-9823-720a3bd71760,Namespace:calico-system,Attempt:0,}" Jan 15 02:01:26.411379 containerd[1710]: time="2026-01-15T02:01:26.411338504Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ljpnb,Uid:bf5cf22e-5674-4890-9d47-13e5c96cd4be,Namespace:kube-system,Attempt:0,}" Jan 15 02:01:26.417290 containerd[1710]: time="2026-01-15T02:01:26.417144335Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b978c6cc9-jnxw2,Uid:85d7699e-6a0a-4fd1-96b0-9365b90d23ad,Namespace:calico-system,Attempt:0,}" Jan 15 02:01:26.735838 kubelet[2935]: E0115 02:01:26.735679 2935 configmap.go:193] Couldn't get configMap calico-system/whisker-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Jan 15 02:01:26.736007 kubelet[2935]: E0115 02:01:26.735851 2935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/05de6337-cbb1-44dd-97eb-3966ed3ddde0-whisker-ca-bundle podName:05de6337-cbb1-44dd-97eb-3966ed3ddde0 nodeName:}" failed. No retries permitted until 2026-01-15 02:01:27.235790954 +0000 UTC m=+43.613782835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "whisker-ca-bundle" (UniqueName: "kubernetes.io/configmap/05de6337-cbb1-44dd-97eb-3966ed3ddde0-whisker-ca-bundle") pod "whisker-c87d4cc77-45zsr" (UID: "05de6337-cbb1-44dd-97eb-3966ed3ddde0") : failed to sync configmap cache: timed out waiting for the condition Jan 15 02:01:26.737259 kubelet[2935]: E0115 02:01:26.736325 2935 secret.go:189] Couldn't get secret calico-system/whisker-backend-key-pair: failed to sync secret cache: timed out waiting for the condition Jan 15 02:01:26.737259 kubelet[2935]: E0115 02:01:26.736455 2935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05de6337-cbb1-44dd-97eb-3966ed3ddde0-whisker-backend-key-pair podName:05de6337-cbb1-44dd-97eb-3966ed3ddde0 nodeName:}" failed. No retries permitted until 2026-01-15 02:01:27.236416271 +0000 UTC m=+43.614408155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "whisker-backend-key-pair" (UniqueName: "kubernetes.io/secret/05de6337-cbb1-44dd-97eb-3966ed3ddde0-whisker-backend-key-pair") pod "whisker-c87d4cc77-45zsr" (UID: "05de6337-cbb1-44dd-97eb-3966ed3ddde0") : failed to sync secret cache: timed out waiting for the condition Jan 15 02:01:26.737259 kubelet[2935]: E0115 02:01:26.736928 2935 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Jan 15 02:01:26.737259 kubelet[2935]: E0115 02:01:26.737023 2935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/768894fc-e95b-49e5-9a90-1487b94ce02a-calico-apiserver-certs podName:768894fc-e95b-49e5-9a90-1487b94ce02a nodeName:}" failed. No retries permitted until 2026-01-15 02:01:27.236989717 +0000 UTC m=+43.614981603 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/768894fc-e95b-49e5-9a90-1487b94ce02a-calico-apiserver-certs") pod "calico-apiserver-5dcd9489f8-x7ppt" (UID: "768894fc-e95b-49e5-9a90-1487b94ce02a") : failed to sync secret cache: timed out waiting for the condition Jan 15 02:01:26.741768 kubelet[2935]: E0115 02:01:26.741660 2935 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Jan 15 02:01:26.742107 kubelet[2935]: E0115 02:01:26.742086 2935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19301f0b-a1cd-4987-b493-85e61d59a457-calico-apiserver-certs podName:19301f0b-a1cd-4987-b493-85e61d59a457 nodeName:}" failed. No retries permitted until 2026-01-15 02:01:27.242012784 +0000 UTC m=+43.620004670 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/19301f0b-a1cd-4987-b493-85e61d59a457-calico-apiserver-certs") pod "calico-apiserver-5dcd9489f8-9mlsx" (UID: "19301f0b-a1cd-4987-b493-85e61d59a457") : failed to sync secret cache: timed out waiting for the condition Jan 15 02:01:26.756086 systemd[1]: Created slice kubepods-besteffort-pod328d556f_d445_4769_97a7_1a5530a232c4.slice - libcontainer container kubepods-besteffort-pod328d556f_d445_4769_97a7_1a5530a232c4.slice. Jan 15 02:01:26.774503 containerd[1710]: time="2026-01-15T02:01:26.771137698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hdqqp,Uid:328d556f-d445-4769-97a7-1a5530a232c4,Namespace:calico-system,Attempt:0,}" Jan 15 02:01:26.931954 containerd[1710]: time="2026-01-15T02:01:26.931872286Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 15 02:01:27.023229 containerd[1710]: time="2026-01-15T02:01:27.023115513Z" level=error msg="Failed to destroy network for sandbox \"2dec75f773bd26b67546cec1756b649c5d268335472c820a1e4ccd98d58cc5aa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.028981 containerd[1710]: time="2026-01-15T02:01:27.028945063Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ljpnb,Uid:bf5cf22e-5674-4890-9d47-13e5c96cd4be,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2dec75f773bd26b67546cec1756b649c5d268335472c820a1e4ccd98d58cc5aa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.029353 kubelet[2935]: E0115 02:01:27.029265 2935 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2dec75f773bd26b67546cec1756b649c5d268335472c820a1e4ccd98d58cc5aa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.029353 kubelet[2935]: E0115 02:01:27.029320 2935 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2dec75f773bd26b67546cec1756b649c5d268335472c820a1e4ccd98d58cc5aa\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-ljpnb" Jan 15 02:01:27.029353 kubelet[2935]: E0115 02:01:27.029337 2935 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2dec75f773bd26b67546cec1756b649c5d268335472c820a1e4ccd98d58cc5aa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-ljpnb" Jan 15 02:01:27.029474 kubelet[2935]: E0115 02:01:27.029370 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-ljpnb_kube-system(bf5cf22e-5674-4890-9d47-13e5c96cd4be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-ljpnb_kube-system(bf5cf22e-5674-4890-9d47-13e5c96cd4be)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2dec75f773bd26b67546cec1756b649c5d268335472c820a1e4ccd98d58cc5aa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-ljpnb" podUID="bf5cf22e-5674-4890-9d47-13e5c96cd4be" Jan 15 02:01:27.035532 containerd[1710]: time="2026-01-15T02:01:27.035509217Z" level=error msg="Failed to destroy network for sandbox \"04c4e06735fec58f00535e1db361429cc6cca09d80bf89c931e9aad05e84586c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.035871 containerd[1710]: time="2026-01-15T02:01:27.035529576Z" level=error msg="Failed to destroy network for sandbox \"c99824e7ece05db622176d6d79be73cae69d60f479a146e806b43933b267c7bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.042567 containerd[1710]: time="2026-01-15T02:01:27.042483662Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-5x8dj,Uid:7529b612-ddf2-4fb7-9823-720a3bd71760,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c99824e7ece05db622176d6d79be73cae69d60f479a146e806b43933b267c7bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.042663 kubelet[2935]: E0115 02:01:27.042635 2935 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c99824e7ece05db622176d6d79be73cae69d60f479a146e806b43933b267c7bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.042705 kubelet[2935]: E0115 02:01:27.042678 2935 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c99824e7ece05db622176d6d79be73cae69d60f479a146e806b43933b267c7bd\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-5x8dj" Jan 15 02:01:27.042705 kubelet[2935]: E0115 02:01:27.042698 2935 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c99824e7ece05db622176d6d79be73cae69d60f479a146e806b43933b267c7bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-5x8dj" Jan 15 02:01:27.042847 kubelet[2935]: E0115 02:01:27.042727 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-5x8dj_calico-system(7529b612-ddf2-4fb7-9823-720a3bd71760)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-5x8dj_calico-system(7529b612-ddf2-4fb7-9823-720a3bd71760)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c99824e7ece05db622176d6d79be73cae69d60f479a146e806b43933b267c7bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:01:27.044312 containerd[1710]: time="2026-01-15T02:01:27.044287761Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hdqqp,Uid:328d556f-d445-4769-97a7-1a5530a232c4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"04c4e06735fec58f00535e1db361429cc6cca09d80bf89c931e9aad05e84586c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.044604 kubelet[2935]: E0115 02:01:27.044508 2935 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04c4e06735fec58f00535e1db361429cc6cca09d80bf89c931e9aad05e84586c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.044604 kubelet[2935]: E0115 02:01:27.044565 2935 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04c4e06735fec58f00535e1db361429cc6cca09d80bf89c931e9aad05e84586c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hdqqp" Jan 15 02:01:27.044604 kubelet[2935]: E0115 02:01:27.044579 2935 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04c4e06735fec58f00535e1db361429cc6cca09d80bf89c931e9aad05e84586c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hdqqp" Jan 15 02:01:27.044720 kubelet[2935]: 
E0115 02:01:27.044622 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-hdqqp_calico-system(328d556f-d445-4769-97a7-1a5530a232c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-hdqqp_calico-system(328d556f-d445-4769-97a7-1a5530a232c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"04c4e06735fec58f00535e1db361429cc6cca09d80bf89c931e9aad05e84586c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:01:27.045165 containerd[1710]: time="2026-01-15T02:01:27.045129260Z" level=error msg="Failed to destroy network for sandbox \"ac78e55ea5bf1b33e78156c524ef01a69fc7289bf98789a02a4a59fcc001fc3a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.049539 containerd[1710]: time="2026-01-15T02:01:27.049511933Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hpxzl,Uid:2ea7b673-cd9f-4a24-98cf-dc76e6ed54b4,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac78e55ea5bf1b33e78156c524ef01a69fc7289bf98789a02a4a59fcc001fc3a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.049670 kubelet[2935]: E0115 02:01:27.049627 2935 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac78e55ea5bf1b33e78156c524ef01a69fc7289bf98789a02a4a59fcc001fc3a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.049670 kubelet[2935]: E0115 02:01:27.049653 2935 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac78e55ea5bf1b33e78156c524ef01a69fc7289bf98789a02a4a59fcc001fc3a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-hpxzl" Jan 15 02:01:27.049746 kubelet[2935]: E0115 02:01:27.049670 2935 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac78e55ea5bf1b33e78156c524ef01a69fc7289bf98789a02a4a59fcc001fc3a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-hpxzl" Jan 15 02:01:27.049746 kubelet[2935]: E0115 02:01:27.049696 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-hpxzl_kube-system(2ea7b673-cd9f-4a24-98cf-dc76e6ed54b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-hpxzl_kube-system(2ea7b673-cd9f-4a24-98cf-dc76e6ed54b4)\\\": rpc error: code = 
Unknown desc = failed to setup network for sandbox \\\"ac78e55ea5bf1b33e78156c524ef01a69fc7289bf98789a02a4a59fcc001fc3a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-hpxzl" podUID="2ea7b673-cd9f-4a24-98cf-dc76e6ed54b4" Jan 15 02:01:27.053510 containerd[1710]: time="2026-01-15T02:01:27.053464117Z" level=error msg="Failed to destroy network for sandbox \"6d493b903e401783cf3f83f94ddb6ecee1ad1d1cc70900f69162e2f5e167e043\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.056774 containerd[1710]: time="2026-01-15T02:01:27.056744174Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b978c6cc9-jnxw2,Uid:85d7699e-6a0a-4fd1-96b0-9365b90d23ad,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d493b903e401783cf3f83f94ddb6ecee1ad1d1cc70900f69162e2f5e167e043\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.057009 kubelet[2935]: E0115 02:01:27.056986 2935 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d493b903e401783cf3f83f94ddb6ecee1ad1d1cc70900f69162e2f5e167e043\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.057079 kubelet[2935]: E0115 02:01:27.057068 2935 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d493b903e401783cf3f83f94ddb6ecee1ad1d1cc70900f69162e2f5e167e043\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" Jan 15 02:01:27.057162 kubelet[2935]: E0115 02:01:27.057121 2935 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d493b903e401783cf3f83f94ddb6ecee1ad1d1cc70900f69162e2f5e167e043\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" Jan 15 02:01:27.057223 kubelet[2935]: E0115 02:01:27.057207 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7b978c6cc9-jnxw2_calico-system(85d7699e-6a0a-4fd1-96b0-9365b90d23ad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7b978c6cc9-jnxw2_calico-system(85d7699e-6a0a-4fd1-96b0-9365b90d23ad)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6d493b903e401783cf3f83f94ddb6ecee1ad1d1cc70900f69162e2f5e167e043\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:01:27.311789 containerd[1710]: time="2026-01-15T02:01:27.311484312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-c87d4cc77-45zsr,Uid:05de6337-cbb1-44dd-97eb-3966ed3ddde0,Namespace:calico-system,Attempt:0,}" Jan 15 02:01:27.312448 containerd[1710]: time="2026-01-15T02:01:27.312374667Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcd9489f8-9mlsx,Uid:19301f0b-a1cd-4987-b493-85e61d59a457,Namespace:calico-apiserver,Attempt:0,}" Jan 15 02:01:27.312752 containerd[1710]: time="2026-01-15T02:01:27.312683412Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcd9489f8-x7ppt,Uid:768894fc-e95b-49e5-9a90-1487b94ce02a,Namespace:calico-apiserver,Attempt:0,}" Jan 15 02:01:27.405147 containerd[1710]: time="2026-01-15T02:01:27.405106533Z" level=error msg="Failed to destroy network for sandbox \"efc81352a121dce89da47ecea483bc3aedb252cd26bb7b7980906375e9d60bee\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.408969 containerd[1710]: time="2026-01-15T02:01:27.408843074Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-c87d4cc77-45zsr,Uid:05de6337-cbb1-44dd-97eb-3966ed3ddde0,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"efc81352a121dce89da47ecea483bc3aedb252cd26bb7b7980906375e9d60bee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.409573 kubelet[2935]: E0115 02:01:27.409041 2935 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efc81352a121dce89da47ecea483bc3aedb252cd26bb7b7980906375e9d60bee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.409573 kubelet[2935]: E0115 02:01:27.409098 2935 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efc81352a121dce89da47ecea483bc3aedb252cd26bb7b7980906375e9d60bee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-c87d4cc77-45zsr" Jan 15 02:01:27.409573 kubelet[2935]: E0115 02:01:27.409117 2935 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efc81352a121dce89da47ecea483bc3aedb252cd26bb7b7980906375e9d60bee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-c87d4cc77-45zsr" Jan 15 02:01:27.409853 kubelet[2935]: E0115 02:01:27.409409 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-c87d4cc77-45zsr_calico-system(05de6337-cbb1-44dd-97eb-3966ed3ddde0)\" with CreatePodSandboxError: \"Failed 
to create sandbox for pod \\\"whisker-c87d4cc77-45zsr_calico-system(05de6337-cbb1-44dd-97eb-3966ed3ddde0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"efc81352a121dce89da47ecea483bc3aedb252cd26bb7b7980906375e9d60bee\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-c87d4cc77-45zsr" podUID="05de6337-cbb1-44dd-97eb-3966ed3ddde0" Jan 15 02:01:27.429255 containerd[1710]: time="2026-01-15T02:01:27.429230091Z" level=error msg="Failed to destroy network for sandbox \"a9770147c38a1ded234934f9efac774a5b164ece942fa2c5e10ba005e3f2cd8a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.433434 containerd[1710]: time="2026-01-15T02:01:27.433352443Z" level=error msg="Failed to destroy network for sandbox \"5abd37871eabba61f5ff11a199435eceba4610384573ed766042c345498b37cf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.433683 containerd[1710]: time="2026-01-15T02:01:27.433566839Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcd9489f8-9mlsx,Uid:19301f0b-a1cd-4987-b493-85e61d59a457,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9770147c38a1ded234934f9efac774a5b164ece942fa2c5e10ba005e3f2cd8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.434301 kubelet[2935]: E0115 02:01:27.433857 2935 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9770147c38a1ded234934f9efac774a5b164ece942fa2c5e10ba005e3f2cd8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.434301 kubelet[2935]: E0115 02:01:27.433903 2935 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9770147c38a1ded234934f9efac774a5b164ece942fa2c5e10ba005e3f2cd8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" Jan 15 02:01:27.434301 kubelet[2935]: E0115 02:01:27.433920 2935 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9770147c38a1ded234934f9efac774a5b164ece942fa2c5e10ba005e3f2cd8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" Jan 15 02:01:27.434445 kubelet[2935]: E0115 02:01:27.433957 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-5dcd9489f8-9mlsx_calico-apiserver(19301f0b-a1cd-4987-b493-85e61d59a457)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5dcd9489f8-9mlsx_calico-apiserver(19301f0b-a1cd-4987-b493-85e61d59a457)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a9770147c38a1ded234934f9efac774a5b164ece942fa2c5e10ba005e3f2cd8a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:01:27.437376 containerd[1710]: time="2026-01-15T02:01:27.437338578Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcd9489f8-x7ppt,Uid:768894fc-e95b-49e5-9a90-1487b94ce02a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5abd37871eabba61f5ff11a199435eceba4610384573ed766042c345498b37cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.437723 kubelet[2935]: E0115 02:01:27.437588 2935 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5abd37871eabba61f5ff11a199435eceba4610384573ed766042c345498b37cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 15 02:01:27.437723 kubelet[2935]: E0115 02:01:27.437643 2935 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5abd37871eabba61f5ff11a199435eceba4610384573ed766042c345498b37cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" Jan 15 02:01:27.437723 kubelet[2935]: E0115 02:01:27.437656 2935 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5abd37871eabba61f5ff11a199435eceba4610384573ed766042c345498b37cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" Jan 15 02:01:27.437907 kubelet[2935]: E0115 02:01:27.437683 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5dcd9489f8-x7ppt_calico-apiserver(768894fc-e95b-49e5-9a90-1487b94ce02a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5dcd9489f8-x7ppt_calico-apiserver(768894fc-e95b-49e5-9a90-1487b94ce02a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5abd37871eabba61f5ff11a199435eceba4610384573ed766042c345498b37cf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" 
podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:01:27.782755 systemd[1]: run-netns-cni\x2d6768c164\x2d5252\x2de797\x2d4b23\x2d52f565659f82.mount: Deactivated successfully. Jan 15 02:01:27.782978 systemd[1]: run-netns-cni\x2d516d75bd\x2d4a1a\x2d9039\x2db251\x2d9010225c6669.mount: Deactivated successfully. Jan 15 02:01:27.783132 systemd[1]: run-netns-cni\x2d989635df\x2d6c50\x2db389\x2d6465\x2dd85e210544cb.mount: Deactivated successfully. Jan 15 02:01:36.123513 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3347290336.mount: Deactivated successfully. Jan 15 02:01:36.295609 containerd[1710]: time="2026-01-15T02:01:36.295500019Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:36.297793 containerd[1710]: time="2026-01-15T02:01:36.297370971Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156880025" Jan 15 02:01:36.299813 containerd[1710]: time="2026-01-15T02:01:36.299704846Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:36.305457 containerd[1710]: time="2026-01-15T02:01:36.305392845Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 15 02:01:36.306973 containerd[1710]: time="2026-01-15T02:01:36.306914031Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 9.374991987s" Jan 15 02:01:36.307183 containerd[1710]: time="2026-01-15T02:01:36.307129248Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Jan 15 02:01:36.334632 containerd[1710]: time="2026-01-15T02:01:36.334572148Z" level=info msg="CreateContainer within sandbox \"767b33bc1a2bdfa0e4a2648c65f7143f37c973fd663de3ea14aa87302928e144\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 15 02:01:36.358530 containerd[1710]: time="2026-01-15T02:01:36.358465480Z" level=info msg="Container 117d30a31bcb0f72e0818634e9814721dbb29f8c524c9b1e0ac78e2cfe8d94cd: CDI devices from CRI Config.CDIDevices: []" Jan 15 02:01:36.379014 containerd[1710]: time="2026-01-15T02:01:36.378754707Z" level=info msg="CreateContainer within sandbox \"767b33bc1a2bdfa0e4a2648c65f7143f37c973fd663de3ea14aa87302928e144\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"117d30a31bcb0f72e0818634e9814721dbb29f8c524c9b1e0ac78e2cfe8d94cd\"" Jan 15 02:01:36.380517 containerd[1710]: time="2026-01-15T02:01:36.380435984Z" level=info msg="StartContainer for \"117d30a31bcb0f72e0818634e9814721dbb29f8c524c9b1e0ac78e2cfe8d94cd\"" Jan 15 02:01:36.383882 containerd[1710]: time="2026-01-15T02:01:36.383801250Z" level=info msg="connecting to shim 117d30a31bcb0f72e0818634e9814721dbb29f8c524c9b1e0ac78e2cfe8d94cd" address="unix:///run/containerd/s/bbfaede1380bd71675d129843fde107fac543ba8bb5b335042a48815bd8a3707" protocol=ttrpc version=3 Jan 15 02:01:36.420414 systemd[1]: Started 
cri-containerd-117d30a31bcb0f72e0818634e9814721dbb29f8c524c9b1e0ac78e2cfe8d94cd.scope - libcontainer container 117d30a31bcb0f72e0818634e9814721dbb29f8c524c9b1e0ac78e2cfe8d94cd. Jan 15 02:01:36.510000 audit: BPF prog-id=172 op=LOAD Jan 15 02:01:36.513338 kernel: kauditd_printk_skb: 12 callbacks suppressed Jan 15 02:01:36.513442 kernel: audit: type=1334 audit(1768442496.510:581): prog-id=172 op=LOAD Jan 15 02:01:36.510000 audit[3936]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3437 pid=3936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:36.510000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131376433306133316263623066373265303831383633346539383134 Jan 15 02:01:36.539904 kernel: audit: type=1300 audit(1768442496.510:581): arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=3437 pid=3936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:36.540194 kernel: audit: type=1327 audit(1768442496.510:581): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131376433306133316263623066373265303831383633346539383134 Jan 15 02:01:36.517000 audit: BPF prog-id=173 op=LOAD Jan 15 02:01:36.548116 kernel: audit: type=1334 audit(1768442496.517:582): prog-id=173 op=LOAD Jan 15 02:01:36.517000 audit[3936]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3437 pid=3936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:36.553685 kernel: audit: type=1300 audit(1768442496.517:582): arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=3437 pid=3936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:36.517000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131376433306133316263623066373265303831383633346539383134 Jan 15 02:01:36.517000 audit: BPF prog-id=173 op=UNLOAD Jan 15 02:01:36.568754 kernel: audit: type=1327 audit(1768442496.517:582): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131376433306133316263623066373265303831383633346539383134 Jan 15 02:01:36.569041 kernel: audit: type=1334 audit(1768442496.517:583): prog-id=173 op=UNLOAD Jan 15 02:01:36.517000 audit[3936]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:36.573094 kernel: audit: type=1300 audit(1768442496.517:583): arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:36.517000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131376433306133316263623066373265303831383633346539383134 Jan 15 02:01:36.579102 kernel: audit: type=1327 audit(1768442496.517:583): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131376433306133316263623066373265303831383633346539383134 Jan 15 02:01:36.517000 audit: BPF prog-id=172 op=UNLOAD Jan 15 02:01:36.583278 kernel: audit: type=1334 audit(1768442496.517:584): prog-id=172 op=UNLOAD Jan 15 02:01:36.517000 audit[3936]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:36.517000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131376433306133316263623066373265303831383633346539383134 Jan 15 02:01:36.518000 audit: BPF prog-id=174 op=LOAD Jan 15 02:01:36.518000 audit[3936]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=3437 pid=3936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:36.518000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131376433306133316263623066373265303831383633346539383134 Jan 15 02:01:36.608709 containerd[1710]: time="2026-01-15T02:01:36.608674908Z" level=info msg="StartContainer for \"117d30a31bcb0f72e0818634e9814721dbb29f8c524c9b1e0ac78e2cfe8d94cd\" returns successfully" Jan 15 02:01:36.727003 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 15 02:01:36.727121 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
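Note on the sandbox failures above: every RunPodSandbox error reports the same root cause, stat /var/lib/calico/nodename failing. The Calico CNI plugin will not wire up a pod until calico-node has written the node's name to that file, and the file only appears once the calico-node container (started just above, after a roughly 9.4s image pull) is running with /var/lib/calico/ mounted. The snippet below paraphrases that gate from the error text; it is an assumption about the behaviour, not an excerpt of the plugin source.

```go
// Rough sketch of the gate implied by the errors above (paraphrased from the error
// text, not taken from the Calico CNI plugin source).
package main

import (
	"fmt"
	"os"
	"strings"
)

const nodenameFile = "/var/lib/calico/nodename"

func detectNodename() (string, error) {
	b, err := os.ReadFile(nodenameFile)
	if err != nil {
		// The condition reported for every failed sandbox above.
		return "", fmt.Errorf("%w: check that the calico/node container is running and has mounted /var/lib/calico/", err)
	}
	return strings.TrimSpace(string(b)), nil
}

func main() {
	if name, err := detectNodename(); err != nil {
		fmt.Println("CNI ADD would fail:", err)
	} else {
		fmt.Println("CNI ADD can proceed for node:", name)
	}
}
```

This is why the pending sandboxes (whisker, coredns, goldmane, the calico-apiserver pods, csi-node-driver) succeed on later retries once calico-node is up, as the subsequent IPAM and networkd entries show.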
Jan 15 02:01:36.925540 kubelet[2935]: I0115 02:01:36.925491 2935 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05de6337-cbb1-44dd-97eb-3966ed3ddde0-whisker-ca-bundle\") pod \"05de6337-cbb1-44dd-97eb-3966ed3ddde0\" (UID: \"05de6337-cbb1-44dd-97eb-3966ed3ddde0\") " Jan 15 02:01:36.925540 kubelet[2935]: I0115 02:01:36.925533 2935 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/05de6337-cbb1-44dd-97eb-3966ed3ddde0-whisker-backend-key-pair\") pod \"05de6337-cbb1-44dd-97eb-3966ed3ddde0\" (UID: \"05de6337-cbb1-44dd-97eb-3966ed3ddde0\") " Jan 15 02:01:36.926372 kubelet[2935]: I0115 02:01:36.925562 2935 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vczts\" (UniqueName: \"kubernetes.io/projected/05de6337-cbb1-44dd-97eb-3966ed3ddde0-kube-api-access-vczts\") pod \"05de6337-cbb1-44dd-97eb-3966ed3ddde0\" (UID: \"05de6337-cbb1-44dd-97eb-3966ed3ddde0\") " Jan 15 02:01:36.926372 kubelet[2935]: I0115 02:01:36.926050 2935 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05de6337-cbb1-44dd-97eb-3966ed3ddde0-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "05de6337-cbb1-44dd-97eb-3966ed3ddde0" (UID: "05de6337-cbb1-44dd-97eb-3966ed3ddde0"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 15 02:01:36.930677 kubelet[2935]: I0115 02:01:36.930638 2935 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05de6337-cbb1-44dd-97eb-3966ed3ddde0-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "05de6337-cbb1-44dd-97eb-3966ed3ddde0" (UID: "05de6337-cbb1-44dd-97eb-3966ed3ddde0"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 15 02:01:36.931344 kubelet[2935]: I0115 02:01:36.931319 2935 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05de6337-cbb1-44dd-97eb-3966ed3ddde0-kube-api-access-vczts" (OuterVolumeSpecName: "kube-api-access-vczts") pod "05de6337-cbb1-44dd-97eb-3966ed3ddde0" (UID: "05de6337-cbb1-44dd-97eb-3966ed3ddde0"). InnerVolumeSpecName "kube-api-access-vczts". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 15 02:01:36.973889 systemd[1]: Removed slice kubepods-besteffort-pod05de6337_cbb1_44dd_97eb_3966ed3ddde0.slice - libcontainer container kubepods-besteffort-pod05de6337_cbb1_44dd_97eb_3966ed3ddde0.slice. 
Jan 15 02:01:36.984059 kubelet[2935]: I0115 02:01:36.983912 2935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-klwkl" podStartSLOduration=1.976946096 podStartE2EDuration="24.983898427s" podCreationTimestamp="2026-01-15 02:01:12 +0000 UTC" firstStartedPulling="2026-01-15 02:01:13.301878053 +0000 UTC m=+29.679869876" lastFinishedPulling="2026-01-15 02:01:36.308830298 +0000 UTC m=+52.686822207" observedRunningTime="2026-01-15 02:01:36.983215851 +0000 UTC m=+53.361207684" watchObservedRunningTime="2026-01-15 02:01:36.983898427 +0000 UTC m=+53.361890263" Jan 15 02:01:37.025840 kubelet[2935]: I0115 02:01:37.025793 2935 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05de6337-cbb1-44dd-97eb-3966ed3ddde0-whisker-ca-bundle\") on node \"ci-4515-1-0-n-e5e35ee394\" DevicePath \"\"" Jan 15 02:01:37.025840 kubelet[2935]: I0115 02:01:37.025816 2935 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/05de6337-cbb1-44dd-97eb-3966ed3ddde0-whisker-backend-key-pair\") on node \"ci-4515-1-0-n-e5e35ee394\" DevicePath \"\"" Jan 15 02:01:37.025840 kubelet[2935]: I0115 02:01:37.025835 2935 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vczts\" (UniqueName: \"kubernetes.io/projected/05de6337-cbb1-44dd-97eb-3966ed3ddde0-kube-api-access-vczts\") on node \"ci-4515-1-0-n-e5e35ee394\" DevicePath \"\"" Jan 15 02:01:37.032869 systemd[1]: Created slice kubepods-besteffort-pod0839e1a2_38f7_4739_be41_8f605565d9d2.slice - libcontainer container kubepods-besteffort-pod0839e1a2_38f7_4739_be41_8f605565d9d2.slice. Jan 15 02:01:37.126595 kubelet[2935]: I0115 02:01:37.126319 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0839e1a2-38f7-4739-be41-8f605565d9d2-whisker-backend-key-pair\") pod \"whisker-5cd9b9db69-tblzt\" (UID: \"0839e1a2-38f7-4739-be41-8f605565d9d2\") " pod="calico-system/whisker-5cd9b9db69-tblzt" Jan 15 02:01:37.126726 kubelet[2935]: I0115 02:01:37.126606 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0839e1a2-38f7-4739-be41-8f605565d9d2-whisker-ca-bundle\") pod \"whisker-5cd9b9db69-tblzt\" (UID: \"0839e1a2-38f7-4739-be41-8f605565d9d2\") " pod="calico-system/whisker-5cd9b9db69-tblzt" Jan 15 02:01:37.126871 kubelet[2935]: I0115 02:01:37.126835 2935 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swvrj\" (UniqueName: \"kubernetes.io/projected/0839e1a2-38f7-4739-be41-8f605565d9d2-kube-api-access-swvrj\") pod \"whisker-5cd9b9db69-tblzt\" (UID: \"0839e1a2-38f7-4739-be41-8f605565d9d2\") " pod="calico-system/whisker-5cd9b9db69-tblzt" Jan 15 02:01:37.127017 systemd[1]: var-lib-kubelet-pods-05de6337\x2dcbb1\x2d44dd\x2d97eb\x2d3966ed3ddde0-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 15 02:01:37.127474 systemd[1]: var-lib-kubelet-pods-05de6337\x2dcbb1\x2d44dd\x2d97eb\x2d3966ed3ddde0-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dvczts.mount: Deactivated successfully. 
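Note on the .mount unit names systemd reports here: they encode the underlying mount paths, with '/' becoming '-' and ambiguous characters (a literal '-', the '~' in kubernetes.io~secret) escaped as \xNN. The helper below is only an illustration, roughly the hex-unescaping part of what systemd-escape --unescape does; it deliberately does not convert the remaining '-' separators back to '/'.

```go
// Illustrative helper only (roughly the hex-unescaping part of `systemd-escape --unescape`):
// decode the \xNN escapes used in the .mount unit names logged above.
package main

import (
	"fmt"
	"strconv"
	"strings"
)

func unescapeUnit(s string) string {
	var out strings.Builder
	for i := 0; i < len(s); {
		if strings.HasPrefix(s[i:], `\x`) && i+4 <= len(s) {
			if v, err := strconv.ParseUint(s[i+2:i+4], 16, 8); err == nil {
				out.WriteByte(byte(v))
				i += 4
				continue
			}
		}
		out.WriteByte(s[i])
		i++
	}
	return out.String()
}

func main() {
	unit := `var-lib-kubelet-pods-05de6337\x2dcbb1\x2d44dd\x2d97eb\x2d3966ed3ddde0-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount`
	fmt.Println(unescapeUnit(unit))
	// The remaining '-' separators stand for '/', i.e. the kubelet secret mount under
	// /var/lib/kubelet/pods/<pod-uid>/volumes/kubernetes.io~secret/whisker-backend-key-pair.
}
```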
Jan 15 02:01:37.336896 containerd[1710]: time="2026-01-15T02:01:37.336811888Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5cd9b9db69-tblzt,Uid:0839e1a2-38f7-4739-be41-8f605565d9d2,Namespace:calico-system,Attempt:0,}" Jan 15 02:01:37.682274 systemd-networkd[1600]: cali96e36ad5ab4: Link UP Jan 15 02:01:37.684064 systemd-networkd[1600]: cali96e36ad5ab4: Gained carrier Jan 15 02:01:37.711669 containerd[1710]: 2026-01-15 02:01:37.395 [INFO][4002] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 15 02:01:37.711669 containerd[1710]: 2026-01-15 02:01:37.517 [INFO][4002] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--e5e35ee394-k8s-whisker--5cd9b9db69--tblzt-eth0 whisker-5cd9b9db69- calico-system 0839e1a2-38f7-4739-be41-8f605565d9d2 917 0 2026-01-15 02:01:37 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5cd9b9db69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4515-1-0-n-e5e35ee394 whisker-5cd9b9db69-tblzt eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali96e36ad5ab4 [] [] }} ContainerID="f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" Namespace="calico-system" Pod="whisker-5cd9b9db69-tblzt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-whisker--5cd9b9db69--tblzt-" Jan 15 02:01:37.711669 containerd[1710]: 2026-01-15 02:01:37.519 [INFO][4002] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" Namespace="calico-system" Pod="whisker-5cd9b9db69-tblzt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-whisker--5cd9b9db69--tblzt-eth0" Jan 15 02:01:37.711669 containerd[1710]: 2026-01-15 02:01:37.588 [INFO][4013] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" HandleID="k8s-pod-network.f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" Workload="ci--4515--1--0--n--e5e35ee394-k8s-whisker--5cd9b9db69--tblzt-eth0" Jan 15 02:01:37.712044 containerd[1710]: 2026-01-15 02:01:37.589 [INFO][4013] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" HandleID="k8s-pod-network.f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" Workload="ci--4515--1--0--n--e5e35ee394-k8s-whisker--5cd9b9db69--tblzt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5a10), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-n-e5e35ee394", "pod":"whisker-5cd9b9db69-tblzt", "timestamp":"2026-01-15 02:01:37.588611566 +0000 UTC"}, Hostname:"ci-4515-1-0-n-e5e35ee394", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 02:01:37.712044 containerd[1710]: 2026-01-15 02:01:37.590 [INFO][4013] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 02:01:37.712044 containerd[1710]: 2026-01-15 02:01:37.590 [INFO][4013] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 15 02:01:37.712044 containerd[1710]: 2026-01-15 02:01:37.590 [INFO][4013] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-e5e35ee394' Jan 15 02:01:37.712044 containerd[1710]: 2026-01-15 02:01:37.609 [INFO][4013] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.712044 containerd[1710]: 2026-01-15 02:01:37.616 [INFO][4013] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.712044 containerd[1710]: 2026-01-15 02:01:37.624 [INFO][4013] ipam/ipam.go 511: Trying affinity for 192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.712044 containerd[1710]: 2026-01-15 02:01:37.627 [INFO][4013] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.712044 containerd[1710]: 2026-01-15 02:01:37.631 [INFO][4013] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.712761 containerd[1710]: 2026-01-15 02:01:37.631 [INFO][4013] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.64/26 handle="k8s-pod-network.f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.712761 containerd[1710]: 2026-01-15 02:01:37.633 [INFO][4013] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530 Jan 15 02:01:37.712761 containerd[1710]: 2026-01-15 02:01:37.641 [INFO][4013] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.64/26 handle="k8s-pod-network.f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.712761 containerd[1710]: 2026-01-15 02:01:37.649 [INFO][4013] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.92.65/26] block=192.168.92.64/26 handle="k8s-pod-network.f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.712761 containerd[1710]: 2026-01-15 02:01:37.649 [INFO][4013] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.65/26] handle="k8s-pod-network.f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.712761 containerd[1710]: 2026-01-15 02:01:37.649 [INFO][4013] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
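The ipam lines above trace Calico's block-affinity allocation under the host-wide IPAM lock: the node's affinity for the /26 block 192.168.92.64/26 is confirmed, the block is loaded, and the first free address in it (192.168.92.65) is claimed for the pod. A minimal toy of that allocation step, purely illustrative and not Calico's actual data model or code:

```python
import ipaddress

# Toy block-affine IPAM: one host-affine /26 block plus a set of
# already-claimed addresses (assumed structures, not Calico's real ones).
block = ipaddress.ip_network("192.168.92.64/26")
allocated: set[ipaddress.IPv4Address] = set()

def assign_from_block(handle: str) -> ipaddress.IPv4Address:
    """Claim the first free address in the host-affine block for `handle`."""
    for ip in block.hosts():          # .65 .. .126 for a /26
        if ip not in allocated:
            allocated.add(ip)         # "Writing block in order to claim IPs"
            return ip
    raise RuntimeError("block exhausted; a real IPAM would try another block")

print(assign_from_block("k8s-pod-network.whisker-sandbox"))  # 192.168.92.65, as in the log
```

The subsequent allocations in this section (192.168.92.66 for calico-kube-controllers, 192.168.92.67 for coredns) follow the same pattern against the same block.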
Jan 15 02:01:37.712761 containerd[1710]: 2026-01-15 02:01:37.649 [INFO][4013] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.65/26] IPv6=[] ContainerID="f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" HandleID="k8s-pod-network.f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" Workload="ci--4515--1--0--n--e5e35ee394-k8s-whisker--5cd9b9db69--tblzt-eth0" Jan 15 02:01:37.713268 containerd[1710]: 2026-01-15 02:01:37.658 [INFO][4002] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" Namespace="calico-system" Pod="whisker-5cd9b9db69-tblzt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-whisker--5cd9b9db69--tblzt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-whisker--5cd9b9db69--tblzt-eth0", GenerateName:"whisker-5cd9b9db69-", Namespace:"calico-system", SelfLink:"", UID:"0839e1a2-38f7-4739-be41-8f605565d9d2", ResourceVersion:"917", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 1, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5cd9b9db69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"", Pod:"whisker-5cd9b9db69-tblzt", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.92.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali96e36ad5ab4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:37.713268 containerd[1710]: 2026-01-15 02:01:37.658 [INFO][4002] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.65/32] ContainerID="f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" Namespace="calico-system" Pod="whisker-5cd9b9db69-tblzt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-whisker--5cd9b9db69--tblzt-eth0" Jan 15 02:01:37.713516 containerd[1710]: 2026-01-15 02:01:37.659 [INFO][4002] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali96e36ad5ab4 ContainerID="f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" Namespace="calico-system" Pod="whisker-5cd9b9db69-tblzt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-whisker--5cd9b9db69--tblzt-eth0" Jan 15 02:01:37.713516 containerd[1710]: 2026-01-15 02:01:37.685 [INFO][4002] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" Namespace="calico-system" Pod="whisker-5cd9b9db69-tblzt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-whisker--5cd9b9db69--tblzt-eth0" Jan 15 02:01:37.713615 containerd[1710]: 2026-01-15 02:01:37.687 [INFO][4002] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" 
Namespace="calico-system" Pod="whisker-5cd9b9db69-tblzt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-whisker--5cd9b9db69--tblzt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-whisker--5cd9b9db69--tblzt-eth0", GenerateName:"whisker-5cd9b9db69-", Namespace:"calico-system", SelfLink:"", UID:"0839e1a2-38f7-4739-be41-8f605565d9d2", ResourceVersion:"917", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 1, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5cd9b9db69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530", Pod:"whisker-5cd9b9db69-tblzt", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.92.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali96e36ad5ab4", MAC:"96:c0:d6:40:4d:be", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:37.713764 containerd[1710]: 2026-01-15 02:01:37.706 [INFO][4002] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" Namespace="calico-system" Pod="whisker-5cd9b9db69-tblzt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-whisker--5cd9b9db69--tblzt-eth0" Jan 15 02:01:37.738318 containerd[1710]: time="2026-01-15T02:01:37.738045861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b978c6cc9-jnxw2,Uid:85d7699e-6a0a-4fd1-96b0-9365b90d23ad,Namespace:calico-system,Attempt:0,}" Jan 15 02:01:37.739298 containerd[1710]: time="2026-01-15T02:01:37.739223633Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ljpnb,Uid:bf5cf22e-5674-4890-9d47-13e5c96cd4be,Namespace:kube-system,Attempt:0,}" Jan 15 02:01:37.744997 kubelet[2935]: I0115 02:01:37.744454 2935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05de6337-cbb1-44dd-97eb-3966ed3ddde0" path="/var/lib/kubelet/pods/05de6337-cbb1-44dd-97eb-3966ed3ddde0/volumes" Jan 15 02:01:37.785020 containerd[1710]: time="2026-01-15T02:01:37.784989952Z" level=info msg="connecting to shim f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530" address="unix:///run/containerd/s/9bcf0da1ad4e210fed06fa627fb4d1754886878879ab6f6858cae5d10ab37573" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:01:37.827407 systemd[1]: Started cri-containerd-f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530.scope - libcontainer container f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530. 
Jan 15 02:01:37.844000 audit: BPF prog-id=175 op=LOAD Jan 15 02:01:37.844000 audit: BPF prog-id=176 op=LOAD Jan 15 02:01:37.844000 audit[4068]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4056 pid=4068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:37.844000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637633538636662333164343336393139343633306631663839613333 Jan 15 02:01:37.845000 audit: BPF prog-id=176 op=UNLOAD Jan 15 02:01:37.845000 audit[4068]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4056 pid=4068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:37.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637633538636662333164343336393139343633306631663839613333 Jan 15 02:01:37.845000 audit: BPF prog-id=177 op=LOAD Jan 15 02:01:37.845000 audit[4068]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4056 pid=4068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:37.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637633538636662333164343336393139343633306631663839613333 Jan 15 02:01:37.846000 audit: BPF prog-id=178 op=LOAD Jan 15 02:01:37.846000 audit[4068]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4056 pid=4068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:37.846000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637633538636662333164343336393139343633306631663839613333 Jan 15 02:01:37.846000 audit: BPF prog-id=178 op=UNLOAD Jan 15 02:01:37.846000 audit[4068]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4056 pid=4068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:37.846000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637633538636662333164343336393139343633306631663839613333 Jan 15 02:01:37.846000 audit: BPF prog-id=177 op=UNLOAD Jan 15 02:01:37.846000 audit[4068]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4056 pid=4068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:37.846000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637633538636662333164343336393139343633306631663839613333 Jan 15 02:01:37.846000 audit: BPF prog-id=179 op=LOAD Jan 15 02:01:37.846000 audit[4068]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4056 pid=4068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:37.846000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637633538636662333164343336393139343633306631663839613333 Jan 15 02:01:37.913248 systemd-networkd[1600]: cali29d17085732: Link UP Jan 15 02:01:37.913545 systemd-networkd[1600]: cali29d17085732: Gained carrier Jan 15 02:01:37.926139 containerd[1710]: 2026-01-15 02:01:37.822 [INFO][4028] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 15 02:01:37.926139 containerd[1710]: 2026-01-15 02:01:37.841 [INFO][4028] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--e5e35ee394-k8s-calico--kube--controllers--7b978c6cc9--jnxw2-eth0 calico-kube-controllers-7b978c6cc9- calico-system 85d7699e-6a0a-4fd1-96b0-9365b90d23ad 846 0 2026-01-15 02:01:13 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7b978c6cc9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4515-1-0-n-e5e35ee394 calico-kube-controllers-7b978c6cc9-jnxw2 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali29d17085732 [] [] }} ContainerID="16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" Namespace="calico-system" Pod="calico-kube-controllers-7b978c6cc9-jnxw2" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--kube--controllers--7b978c6cc9--jnxw2-" Jan 15 02:01:37.926139 containerd[1710]: 2026-01-15 02:01:37.841 [INFO][4028] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" Namespace="calico-system" Pod="calico-kube-controllers-7b978c6cc9-jnxw2" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--kube--controllers--7b978c6cc9--jnxw2-eth0" Jan 15 02:01:37.926139 containerd[1710]: 2026-01-15 02:01:37.871 [INFO][4098] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" HandleID="k8s-pod-network.16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" Workload="ci--4515--1--0--n--e5e35ee394-k8s-calico--kube--controllers--7b978c6cc9--jnxw2-eth0" Jan 15 02:01:37.926480 containerd[1710]: 2026-01-15 02:01:37.871 [INFO][4098] ipam/ipam_plugin.go 
275: Auto assigning IP ContainerID="16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" HandleID="k8s-pod-network.16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" Workload="ci--4515--1--0--n--e5e35ee394-k8s-calico--kube--controllers--7b978c6cc9--jnxw2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c55a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-n-e5e35ee394", "pod":"calico-kube-controllers-7b978c6cc9-jnxw2", "timestamp":"2026-01-15 02:01:37.871702144 +0000 UTC"}, Hostname:"ci-4515-1-0-n-e5e35ee394", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 02:01:37.926480 containerd[1710]: 2026-01-15 02:01:37.871 [INFO][4098] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 02:01:37.926480 containerd[1710]: 2026-01-15 02:01:37.872 [INFO][4098] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 02:01:37.926480 containerd[1710]: 2026-01-15 02:01:37.872 [INFO][4098] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-e5e35ee394' Jan 15 02:01:37.926480 containerd[1710]: 2026-01-15 02:01:37.879 [INFO][4098] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.926480 containerd[1710]: 2026-01-15 02:01:37.882 [INFO][4098] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.926480 containerd[1710]: 2026-01-15 02:01:37.885 [INFO][4098] ipam/ipam.go 511: Trying affinity for 192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.926480 containerd[1710]: 2026-01-15 02:01:37.887 [INFO][4098] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.926480 containerd[1710]: 2026-01-15 02:01:37.888 [INFO][4098] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.926681 containerd[1710]: 2026-01-15 02:01:37.888 [INFO][4098] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.64/26 handle="k8s-pod-network.16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.926681 containerd[1710]: 2026-01-15 02:01:37.889 [INFO][4098] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec Jan 15 02:01:37.926681 containerd[1710]: 2026-01-15 02:01:37.895 [INFO][4098] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.64/26 handle="k8s-pod-network.16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.926681 containerd[1710]: 2026-01-15 02:01:37.906 [INFO][4098] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.92.66/26] block=192.168.92.64/26 handle="k8s-pod-network.16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.926681 containerd[1710]: 2026-01-15 02:01:37.906 [INFO][4098] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.66/26] handle="k8s-pod-network.16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:37.926681 
containerd[1710]: 2026-01-15 02:01:37.906 [INFO][4098] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 15 02:01:37.926681 containerd[1710]: 2026-01-15 02:01:37.906 [INFO][4098] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.66/26] IPv6=[] ContainerID="16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" HandleID="k8s-pod-network.16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" Workload="ci--4515--1--0--n--e5e35ee394-k8s-calico--kube--controllers--7b978c6cc9--jnxw2-eth0" Jan 15 02:01:37.926813 containerd[1710]: 2026-01-15 02:01:37.910 [INFO][4028] cni-plugin/k8s.go 418: Populated endpoint ContainerID="16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" Namespace="calico-system" Pod="calico-kube-controllers-7b978c6cc9-jnxw2" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--kube--controllers--7b978c6cc9--jnxw2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-calico--kube--controllers--7b978c6cc9--jnxw2-eth0", GenerateName:"calico-kube-controllers-7b978c6cc9-", Namespace:"calico-system", SelfLink:"", UID:"85d7699e-6a0a-4fd1-96b0-9365b90d23ad", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 1, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7b978c6cc9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"", Pod:"calico-kube-controllers-7b978c6cc9-jnxw2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.92.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali29d17085732", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:37.926867 containerd[1710]: 2026-01-15 02:01:37.911 [INFO][4028] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.66/32] ContainerID="16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" Namespace="calico-system" Pod="calico-kube-controllers-7b978c6cc9-jnxw2" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--kube--controllers--7b978c6cc9--jnxw2-eth0" Jan 15 02:01:37.926867 containerd[1710]: 2026-01-15 02:01:37.911 [INFO][4028] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali29d17085732 ContainerID="16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" Namespace="calico-system" Pod="calico-kube-controllers-7b978c6cc9-jnxw2" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--kube--controllers--7b978c6cc9--jnxw2-eth0" Jan 15 02:01:37.926867 containerd[1710]: 2026-01-15 02:01:37.914 [INFO][4028] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" Namespace="calico-system" 
Pod="calico-kube-controllers-7b978c6cc9-jnxw2" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--kube--controllers--7b978c6cc9--jnxw2-eth0" Jan 15 02:01:37.926929 containerd[1710]: 2026-01-15 02:01:37.914 [INFO][4028] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" Namespace="calico-system" Pod="calico-kube-controllers-7b978c6cc9-jnxw2" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--kube--controllers--7b978c6cc9--jnxw2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-calico--kube--controllers--7b978c6cc9--jnxw2-eth0", GenerateName:"calico-kube-controllers-7b978c6cc9-", Namespace:"calico-system", SelfLink:"", UID:"85d7699e-6a0a-4fd1-96b0-9365b90d23ad", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 1, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7b978c6cc9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec", Pod:"calico-kube-controllers-7b978c6cc9-jnxw2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.92.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali29d17085732", MAC:"3e:bb:03:4b:de:71", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:37.926980 containerd[1710]: 2026-01-15 02:01:37.923 [INFO][4028] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" Namespace="calico-system" Pod="calico-kube-controllers-7b978c6cc9-jnxw2" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--kube--controllers--7b978c6cc9--jnxw2-eth0" Jan 15 02:01:38.014954 systemd-networkd[1600]: cali5556ebaadbc: Link UP Jan 15 02:01:38.015627 systemd-networkd[1600]: cali5556ebaadbc: Gained carrier Jan 15 02:01:38.028318 containerd[1710]: 2026-01-15 02:01:37.801 [INFO][4035] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 15 02:01:38.028318 containerd[1710]: 2026-01-15 02:01:37.818 [INFO][4035] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--ljpnb-eth0 coredns-668d6bf9bc- kube-system bf5cf22e-5674-4890-9d47-13e5c96cd4be 854 0 2026-01-15 02:00:50 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4515-1-0-n-e5e35ee394 coredns-668d6bf9bc-ljpnb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] 
cali5556ebaadbc [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" Namespace="kube-system" Pod="coredns-668d6bf9bc-ljpnb" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--ljpnb-" Jan 15 02:01:38.028318 containerd[1710]: 2026-01-15 02:01:37.818 [INFO][4035] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" Namespace="kube-system" Pod="coredns-668d6bf9bc-ljpnb" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--ljpnb-eth0" Jan 15 02:01:38.028318 containerd[1710]: 2026-01-15 02:01:37.875 [INFO][4083] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" HandleID="k8s-pod-network.c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" Workload="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--ljpnb-eth0" Jan 15 02:01:38.028529 containerd[1710]: 2026-01-15 02:01:37.875 [INFO][4083] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" HandleID="k8s-pod-network.c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" Workload="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--ljpnb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d55d0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4515-1-0-n-e5e35ee394", "pod":"coredns-668d6bf9bc-ljpnb", "timestamp":"2026-01-15 02:01:37.875186854 +0000 UTC"}, Hostname:"ci-4515-1-0-n-e5e35ee394", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 02:01:38.028529 containerd[1710]: 2026-01-15 02:01:37.875 [INFO][4083] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 02:01:38.028529 containerd[1710]: 2026-01-15 02:01:37.906 [INFO][4083] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 15 02:01:38.028529 containerd[1710]: 2026-01-15 02:01:37.906 [INFO][4083] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-e5e35ee394' Jan 15 02:01:38.028529 containerd[1710]: 2026-01-15 02:01:37.980 [INFO][4083] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:38.028529 containerd[1710]: 2026-01-15 02:01:37.985 [INFO][4083] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:38.028529 containerd[1710]: 2026-01-15 02:01:37.990 [INFO][4083] ipam/ipam.go 511: Trying affinity for 192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:38.028529 containerd[1710]: 2026-01-15 02:01:37.993 [INFO][4083] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:38.028529 containerd[1710]: 2026-01-15 02:01:37.995 [INFO][4083] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:38.028772 containerd[1710]: 2026-01-15 02:01:37.995 [INFO][4083] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.64/26 handle="k8s-pod-network.c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:38.028772 containerd[1710]: 2026-01-15 02:01:37.996 [INFO][4083] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a Jan 15 02:01:38.028772 containerd[1710]: 2026-01-15 02:01:38.002 [INFO][4083] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.64/26 handle="k8s-pod-network.c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:38.028772 containerd[1710]: 2026-01-15 02:01:38.008 [INFO][4083] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.92.67/26] block=192.168.92.64/26 handle="k8s-pod-network.c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:38.028772 containerd[1710]: 2026-01-15 02:01:38.008 [INFO][4083] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.67/26] handle="k8s-pod-network.c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:38.028772 containerd[1710]: 2026-01-15 02:01:38.008 [INFO][4083] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 15 02:01:38.028772 containerd[1710]: 2026-01-15 02:01:38.008 [INFO][4083] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.67/26] IPv6=[] ContainerID="c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" HandleID="k8s-pod-network.c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" Workload="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--ljpnb-eth0" Jan 15 02:01:38.029000 containerd[1710]: 2026-01-15 02:01:38.010 [INFO][4035] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" Namespace="kube-system" Pod="coredns-668d6bf9bc-ljpnb" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--ljpnb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--ljpnb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"bf5cf22e-5674-4890-9d47-13e5c96cd4be", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 0, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"", Pod:"coredns-668d6bf9bc-ljpnb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5556ebaadbc", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:38.029000 containerd[1710]: 2026-01-15 02:01:38.010 [INFO][4035] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.67/32] ContainerID="c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" Namespace="kube-system" Pod="coredns-668d6bf9bc-ljpnb" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--ljpnb-eth0" Jan 15 02:01:38.029000 containerd[1710]: 2026-01-15 02:01:38.010 [INFO][4035] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5556ebaadbc ContainerID="c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" Namespace="kube-system" Pod="coredns-668d6bf9bc-ljpnb" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--ljpnb-eth0" Jan 15 02:01:38.029000 containerd[1710]: 2026-01-15 02:01:38.015 [INFO][4035] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-ljpnb" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--ljpnb-eth0" Jan 15 02:01:38.029000 containerd[1710]: 2026-01-15 02:01:38.016 [INFO][4035] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" Namespace="kube-system" Pod="coredns-668d6bf9bc-ljpnb" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--ljpnb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--ljpnb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"bf5cf22e-5674-4890-9d47-13e5c96cd4be", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 0, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a", Pod:"coredns-668d6bf9bc-ljpnb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5556ebaadbc", MAC:"86:cd:06:88:a8:10", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:38.029000 containerd[1710]: 2026-01-15 02:01:38.024 [INFO][4035] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" Namespace="kube-system" Pod="coredns-668d6bf9bc-ljpnb" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--ljpnb-eth0" Jan 15 02:01:38.176765 containerd[1710]: time="2026-01-15T02:01:38.176621537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5cd9b9db69-tblzt,Uid:0839e1a2-38f7-4739-be41-8f605565d9d2,Namespace:calico-system,Attempt:0,} returns sandbox id \"f7c58cfb31d4369194630f1f89a33da2f7c33a645cc0842f49f75820f99cc530\"" Jan 15 02:01:38.182263 containerd[1710]: time="2026-01-15T02:01:38.182071175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 02:01:38.600000 audit: BPF prog-id=180 op=LOAD Jan 15 02:01:38.600000 audit[4241]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffed78469d0 a2=98 a3=1fffffffffffffff items=0 ppid=4186 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.600000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 02:01:38.601000 audit: BPF prog-id=180 op=UNLOAD Jan 15 02:01:38.601000 audit[4241]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffed78469a0 a3=0 items=0 ppid=4186 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.601000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 02:01:38.601000 audit: BPF prog-id=181 op=LOAD Jan 15 02:01:38.601000 audit[4241]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffed78468b0 a2=94 a3=3 items=0 ppid=4186 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.601000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 02:01:38.601000 audit: BPF prog-id=181 op=UNLOAD Jan 15 02:01:38.601000 audit[4241]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffed78468b0 a2=94 a3=3 items=0 ppid=4186 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.601000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 02:01:38.601000 audit: BPF prog-id=182 op=LOAD Jan 15 02:01:38.601000 audit[4241]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffed78468f0 a2=94 a3=7ffed7846ad0 items=0 ppid=4186 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.601000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 02:01:38.601000 audit: BPF prog-id=182 op=UNLOAD Jan 15 02:01:38.601000 audit[4241]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffed78468f0 a2=94 a3=7ffed7846ad0 items=0 ppid=4186 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.601000 audit: 
PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 15 02:01:38.603000 audit: BPF prog-id=183 op=LOAD Jan 15 02:01:38.603000 audit[4242]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff4c707270 a2=98 a3=3 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.603000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.603000 audit: BPF prog-id=183 op=UNLOAD Jan 15 02:01:38.603000 audit[4242]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff4c707240 a3=0 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.603000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.603000 audit: BPF prog-id=184 op=LOAD Jan 15 02:01:38.603000 audit[4242]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff4c707060 a2=94 a3=54428f items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.603000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.603000 audit: BPF prog-id=184 op=UNLOAD Jan 15 02:01:38.603000 audit[4242]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff4c707060 a2=94 a3=54428f items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.603000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.603000 audit: BPF prog-id=185 op=LOAD Jan 15 02:01:38.603000 audit[4242]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff4c707090 a2=94 a3=2 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.603000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.603000 audit: BPF prog-id=185 op=UNLOAD Jan 15 02:01:38.603000 audit[4242]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff4c707090 a2=0 a3=2 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.603000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.764000 audit: BPF prog-id=186 op=LOAD Jan 15 02:01:38.764000 audit[4242]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff4c706f50 a2=94 a3=1 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
15 02:01:38.764000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.764000 audit: BPF prog-id=186 op=UNLOAD Jan 15 02:01:38.764000 audit[4242]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7fff4c706f50 a2=94 a3=1 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.764000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.774000 audit: BPF prog-id=187 op=LOAD Jan 15 02:01:38.774000 audit[4242]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff4c706f40 a2=94 a3=4 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.774000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.774000 audit: BPF prog-id=187 op=UNLOAD Jan 15 02:01:38.774000 audit[4242]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7fff4c706f40 a2=0 a3=4 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.774000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.774000 audit: BPF prog-id=188 op=LOAD Jan 15 02:01:38.774000 audit[4242]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff4c706da0 a2=94 a3=5 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.774000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.774000 audit: BPF prog-id=188 op=UNLOAD Jan 15 02:01:38.774000 audit[4242]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff4c706da0 a2=0 a3=5 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.774000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.774000 audit: BPF prog-id=189 op=LOAD Jan 15 02:01:38.774000 audit[4242]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff4c706fc0 a2=94 a3=6 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.774000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.774000 audit: BPF prog-id=189 op=UNLOAD Jan 15 02:01:38.774000 audit[4242]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7fff4c706fc0 a2=0 a3=6 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.774000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.774000 audit: BPF prog-id=190 op=LOAD Jan 15 02:01:38.774000 audit[4242]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7fff4c706770 a2=94 a3=88 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.774000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.774000 audit: BPF prog-id=191 op=LOAD Jan 15 02:01:38.774000 audit[4242]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7fff4c7065f0 a2=94 a3=2 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.774000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.774000 audit: BPF prog-id=191 op=UNLOAD Jan 15 02:01:38.774000 audit[4242]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7fff4c706620 a2=0 a3=7fff4c706720 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.774000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.775000 audit: BPF prog-id=190 op=UNLOAD Jan 15 02:01:38.775000 audit[4242]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=2dc02d10 a2=0 a3=1ed6487d766ac4c6 items=0 ppid=4186 pid=4242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.775000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 15 02:01:38.885000 audit: BPF prog-id=192 op=LOAD Jan 15 02:01:38.885000 audit[4245]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc4006a1a0 a2=98 a3=1999999999999999 items=0 ppid=4186 pid=4245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.885000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 02:01:38.885000 audit: BPF prog-id=192 op=UNLOAD Jan 15 02:01:38.885000 audit[4245]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffc4006a170 a3=0 items=0 ppid=4186 pid=4245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.885000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 02:01:38.885000 audit: BPF prog-id=193 op=LOAD Jan 15 02:01:38.885000 audit[4245]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc4006a080 a2=94 a3=ffff items=0 ppid=4186 pid=4245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.885000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 02:01:38.885000 audit: BPF prog-id=193 op=UNLOAD Jan 15 02:01:38.885000 audit[4245]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffc4006a080 a2=94 a3=ffff items=0 ppid=4186 pid=4245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.885000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 02:01:38.885000 audit: BPF prog-id=194 op=LOAD Jan 15 02:01:38.885000 audit[4245]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc4006a0c0 a2=94 a3=7ffc4006a2a0 items=0 ppid=4186 pid=4245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.885000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 02:01:38.886000 audit: BPF prog-id=194 op=UNLOAD Jan 15 02:01:38.886000 audit[4245]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7ffc4006a0c0 a2=94 a3=7ffc4006a2a0 items=0 ppid=4186 pid=4245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:38.886000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 15 02:01:38.886670 systemd-networkd[1600]: cali96e36ad5ab4: Gained IPv6LL Jan 15 02:01:39.035040 containerd[1710]: time="2026-01-15T02:01:39.034969946Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:01:39.622660 systemd-networkd[1600]: vxlan.calico: Link UP Jan 15 02:01:39.622673 systemd-networkd[1600]: vxlan.calico: Gained carrier Jan 15 02:01:39.632789 containerd[1710]: time="2026-01-15T02:01:39.632382369Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 02:01:39.632789 containerd[1710]: time="2026-01-15T02:01:39.632538528Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:39.634212 kubelet[2935]: E0115 02:01:39.633074 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 02:01:39.634212 kubelet[2935]: E0115 02:01:39.633122 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 02:01:39.662000 audit: BPF prog-id=195 op=LOAD Jan 15 02:01:39.662000 audit[4272]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff909bf490 a2=98 a3=0 items=0 ppid=4186 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.662000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 02:01:39.662000 audit: BPF prog-id=195 op=UNLOAD Jan 15 02:01:39.662000 audit[4272]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7fff909bf460 a3=0 items=0 ppid=4186 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.662000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 02:01:39.662000 audit: BPF prog-id=196 op=LOAD Jan 15 02:01:39.662000 audit[4272]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff909bf2a0 a2=94 a3=54428f items=0 ppid=4186 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.662000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 02:01:39.663000 audit: BPF prog-id=196 op=UNLOAD Jan 15 02:01:39.663000 audit[4272]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff909bf2a0 a2=94 a3=54428f items=0 ppid=4186 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.663000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 02:01:39.663000 audit: BPF prog-id=197 op=LOAD Jan 15 02:01:39.663000 audit[4272]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff909bf2d0 a2=94 a3=2 items=0 ppid=4186 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.663000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 02:01:39.663000 audit: BPF prog-id=197 op=UNLOAD Jan 15 02:01:39.663000 audit[4272]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=7fff909bf2d0 a2=0 a3=2 items=0 ppid=4186 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.663000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 02:01:39.663000 audit: BPF prog-id=198 op=LOAD Jan 15 02:01:39.663000 audit[4272]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff909bf080 a2=94 a3=4 items=0 ppid=4186 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.663000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 02:01:39.663000 audit: BPF prog-id=198 op=UNLOAD Jan 15 02:01:39.663000 audit[4272]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff909bf080 a2=94 a3=4 items=0 ppid=4186 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.663000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 02:01:39.663000 audit: BPF prog-id=199 op=LOAD Jan 15 02:01:39.663000 audit[4272]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff909bf180 a2=94 a3=7fff909bf300 items=0 ppid=4186 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.663000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 02:01:39.663000 audit: BPF prog-id=199 op=UNLOAD Jan 15 02:01:39.663000 audit[4272]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff909bf180 a2=0 a3=7fff909bf300 items=0 ppid=4186 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.663000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 02:01:39.664000 audit: BPF prog-id=200 op=LOAD Jan 15 02:01:39.664000 audit[4272]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff909be8b0 a2=94 a3=2 items=0 ppid=4186 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.664000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 02:01:39.668000 audit: BPF prog-id=200 op=UNLOAD Jan 15 02:01:39.668000 audit[4272]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=6 a1=7fff909be8b0 a2=0 a3=2 items=0 ppid=4186 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.668000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 02:01:39.668000 audit: BPF prog-id=201 op=LOAD Jan 15 02:01:39.668000 audit[4272]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff909be9b0 a2=94 a3=30 items=0 ppid=4186 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.668000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 15 02:01:39.677000 audit: BPF prog-id=202 op=LOAD Jan 15 02:01:39.677000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe6d04c9e0 a2=98 a3=0 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.677000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.677000 audit: BPF prog-id=202 op=UNLOAD Jan 15 02:01:39.677000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=3 a1=8 a2=7ffe6d04c9b0 a3=0 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.677000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.677000 audit: BPF prog-id=203 op=LOAD Jan 15 02:01:39.677000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe6d04c7d0 a2=94 a3=54428f items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.677000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.677000 audit: BPF prog-id=203 op=UNLOAD Jan 15 02:01:39.677000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe6d04c7d0 a2=94 a3=54428f items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.677000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.677000 audit: BPF prog-id=204 op=LOAD Jan 15 02:01:39.677000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe6d04c800 a2=94 a3=2 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.677000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.679000 audit: BPF prog-id=204 op=UNLOAD Jan 15 02:01:39.679000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe6d04c800 a2=0 a3=2 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.679000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.694949 kubelet[2935]: E0115 02:01:39.694738 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:66ec0441fba04fd9bc4d56215e73797e,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-swvrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cd9b9db69-tblzt_calico-system(0839e1a2-38f7-4739-be41-8f605565d9d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 02:01:39.804233 containerd[1710]: time="2026-01-15T02:01:39.698579957Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 02:01:39.804233 containerd[1710]: time="2026-01-15T02:01:39.738637425Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hpxzl,Uid:2ea7b673-cd9f-4a24-98cf-dc76e6ed54b4,Namespace:kube-system,Attempt:0,}" Jan 15 02:01:39.804233 containerd[1710]: time="2026-01-15T02:01:39.738783405Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcd9489f8-x7ppt,Uid:768894fc-e95b-49e5-9a90-1487b94ce02a,Namespace:calico-apiserver,Attempt:0,}" Jan 15 02:01:39.804233 containerd[1710]: time="2026-01-15T02:01:39.739198849Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hdqqp,Uid:328d556f-d445-4769-97a7-1a5530a232c4,Namespace:calico-system,Attempt:0,}" Jan 15 02:01:39.860000 audit: BPF prog-id=205 op=LOAD Jan 15 02:01:39.860000 audit: BPF prog-id=206 op=LOAD Jan 15 02:01:39.860000 audit[4299]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0238 a2=98 a3=0 items=0 ppid=4288 pid=4299 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.860000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303939633932353539623031353439363734353564383234666233 Jan 15 02:01:39.861000 audit: BPF prog-id=206 op=UNLOAD Jan 15 02:01:39.861000 audit[4299]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 
a2=0 a3=0 items=0 ppid=4288 pid=4299 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303939633932353539623031353439363734353564383234666233 Jan 15 02:01:39.861000 audit: BPF prog-id=207 op=LOAD Jan 15 02:01:39.861000 audit[4299]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b0488 a2=98 a3=0 items=0 ppid=4288 pid=4299 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303939633932353539623031353439363734353564383234666233 Jan 15 02:01:39.861000 audit: BPF prog-id=208 op=LOAD Jan 15 02:01:39.861000 audit[4299]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c0001b0218 a2=98 a3=0 items=0 ppid=4288 pid=4299 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303939633932353539623031353439363734353564383234666233 Jan 15 02:01:39.861000 audit: BPF prog-id=208 op=UNLOAD Jan 15 02:01:39.861000 audit[4299]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4288 pid=4299 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303939633932353539623031353439363734353564383234666233 Jan 15 02:01:39.861000 audit: BPF prog-id=207 op=UNLOAD Jan 15 02:01:39.861000 audit[4299]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4288 pid=4299 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303939633932353539623031353439363734353564383234666233 Jan 15 02:01:39.861000 audit: BPF prog-id=209 op=LOAD Jan 15 02:01:39.861000 audit[4299]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001b06e8 a2=98 a3=0 items=0 ppid=4288 pid=4299 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136303939633932353539623031353439363734353564383234666233 Jan 15 02:01:39.883000 audit: BPF prog-id=210 op=LOAD Jan 15 02:01:39.883000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe6d04c6c0 a2=94 a3=1 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.883000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.883000 audit: BPF prog-id=210 op=UNLOAD Jan 15 02:01:39.883000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=4 a1=7ffe6d04c6c0 a2=94 a3=1 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.883000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.905000 audit: BPF prog-id=211 op=LOAD Jan 15 02:01:39.905000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe6d04c6b0 a2=94 a3=4 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.905000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.905000 audit: BPF prog-id=211 op=UNLOAD Jan 15 02:01:39.905000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffe6d04c6b0 a2=0 a3=4 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.905000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.906000 audit: BPF prog-id=212 op=LOAD Jan 15 02:01:39.906000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffe6d04c510 a2=94 a3=5 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.906000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.906000 audit: BPF prog-id=212 op=UNLOAD Jan 15 02:01:39.906000 audit[4275]: SYSCALL arch=c000003e 
syscall=3 success=yes exit=0 a0=6 a1=7ffe6d04c510 a2=0 a3=5 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.906000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.906000 audit: BPF prog-id=213 op=LOAD Jan 15 02:01:39.906000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe6d04c730 a2=94 a3=6 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.906000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.906000 audit: BPF prog-id=213 op=UNLOAD Jan 15 02:01:39.906000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=7ffe6d04c730 a2=0 a3=6 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.906000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.906000 audit: BPF prog-id=214 op=LOAD Jan 15 02:01:39.906000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=5 a1=7ffe6d04bee0 a2=94 a3=88 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.906000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.906000 audit: BPF prog-id=215 op=LOAD Jan 15 02:01:39.906000 audit[4275]: SYSCALL arch=c000003e syscall=321 success=yes exit=7 a0=5 a1=7ffe6d04bd60 a2=94 a3=2 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.906000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.906000 audit: BPF prog-id=215 op=UNLOAD Jan 15 02:01:39.906000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=7 a1=7ffe6d04bd90 a2=0 a3=7ffe6d04be90 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.906000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.907000 audit: BPF prog-id=214 op=UNLOAD Jan 15 02:01:39.907000 audit[4275]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=5 a1=646fd10 a2=0 a3=979a233ff6ce1a75 items=0 ppid=4186 pid=4275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.907000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 15 02:01:39.716705 systemd-networkd[1600]: cali29d17085732: Gained IPv6LL Jan 15 02:01:39.912557 containerd[1710]: time="2026-01-15T02:01:39.816051237Z" level=info msg="connecting to shim 16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec" address="unix:///run/containerd/s/6128e97960b34b7853d146bec70694130fbde48ac1a333e60161144d69c29768" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:01:39.847436 systemd[1]: Started cri-containerd-16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec.scope - libcontainer container 16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec. Jan 15 02:01:39.915000 audit: BPF prog-id=201 op=UNLOAD Jan 15 02:01:39.915000 audit[4186]: SYSCALL arch=c000003e syscall=263 success=yes exit=0 a0=ffffffffffffff9c a1=c000ee2e80 a2=0 a3=0 items=0 ppid=4129 pid=4186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:39.915000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 15 02:01:39.972518 systemd-networkd[1600]: cali5556ebaadbc: Gained IPv6LL Jan 15 02:01:40.012000 audit[4348]: NETFILTER_CFG table=mangle:119 family=2 entries=16 op=nft_register_chain pid=4348 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 02:01:40.012000 audit[4348]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7ffe1db0ae10 a2=0 a3=7ffe1db0adfc items=0 ppid=4186 pid=4348 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.012000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 02:01:40.018000 audit[4351]: NETFILTER_CFG table=nat:120 family=2 entries=15 op=nft_register_chain pid=4351 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 02:01:40.018000 audit[4351]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffe5206a400 a2=0 a3=7ffe5206a3ec items=0 ppid=4186 pid=4351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.018000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 02:01:40.020000 audit[4349]: NETFILTER_CFG table=raw:121 family=2 entries=21 
op=nft_register_chain pid=4349 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 02:01:40.020000 audit[4349]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7fffbe0537a0 a2=0 a3=7fffbe05378c items=0 ppid=4186 pid=4349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.020000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 02:01:40.028000 audit[4352]: NETFILTER_CFG table=filter:122 family=2 entries=156 op=nft_register_chain pid=4352 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 02:01:40.028000 audit[4352]: SYSCALL arch=c000003e syscall=46 success=yes exit=89444 a0=3 a1=7fff8a351680 a2=0 a3=7fff8a35166c items=0 ppid=4186 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.028000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 02:01:40.121636 containerd[1710]: time="2026-01-15T02:01:40.121315140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b978c6cc9-jnxw2,Uid:85d7699e-6a0a-4fd1-96b0-9365b90d23ad,Namespace:calico-system,Attempt:0,} returns sandbox id \"16099c92559b0154967455d824fb3a4ff843beb5aacffa6fb45fb3b43b1b36ec\"" Jan 15 02:01:40.137131 containerd[1710]: time="2026-01-15T02:01:40.137098525Z" level=info msg="connecting to shim c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a" address="unix:///run/containerd/s/6ccf2375f9a4acae614a9a94c92eb16f21b8ff8ebc148d0dfcf7f2868a7870be" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:01:40.186660 systemd[1]: Started cri-containerd-c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a.scope - libcontainer container c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a. 
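The repeated "not found" errors above for ghcr.io/flatcar/calico/whisker:v3.30.4 come from containerd failing to resolve that tag against the registry. A minimal Python sketch of the same lookup follows (an illustration, not part of the captured log): it fetches an anonymous pull token and issues a HEAD request against the OCI distribution manifest endpoint, so an HTTP 404 here corresponds to the NotFound the kubelet reports. The repository and tag are copied from the messages above; the token URL assumes ghcr.io's usual anonymous-token realm.

import json
import urllib.error
import urllib.request

REGISTRY = "ghcr.io"
REPO = "flatcar/calico/whisker"   # repository and tag taken from the log above
TAG = "v3.30.4"

def manifest_status(registry: str, repo: str, tag: str) -> int:
    # ghcr.io hands out anonymous pull tokens for public repositories
    # (assumed token realm: https://<registry>/token).
    token_url = f"https://{registry}/token?scope=repository:{repo}:pull"
    with urllib.request.urlopen(token_url) as resp:
        token = json.load(resp)["token"]
    req = urllib.request.Request(
        f"https://{registry}/v2/{repo}/manifests/{tag}",
        method="HEAD",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.oci.image.index.v1+json, "
                      "application/vnd.docker.distribution.manifest.list.v2+json",
        },
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status   # 200: the tag resolves and a pull would proceed
    except urllib.error.HTTPError as err:
        return err.code          # 404 matches the NotFound errors in the log

if __name__ == "__main__":
    print(f"{REPO}:{TAG} -> HTTP {manifest_status(REGISTRY, REPO, TAG)}")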
Jan 15 02:01:40.216000 audit: BPF prog-id=216 op=LOAD Jan 15 02:01:40.217000 audit: BPF prog-id=217 op=LOAD Jan 15 02:01:40.217000 audit[4415]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4389 pid=4415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.217000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339656464643835303138396430323634616232613432623130303631 Jan 15 02:01:40.217000 audit: BPF prog-id=217 op=UNLOAD Jan 15 02:01:40.217000 audit[4415]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4389 pid=4415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.217000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339656464643835303138396430323634616232613432623130303631 Jan 15 02:01:40.217000 audit: BPF prog-id=218 op=LOAD Jan 15 02:01:40.217000 audit[4415]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4389 pid=4415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.217000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339656464643835303138396430323634616232613432623130303631 Jan 15 02:01:40.217000 audit: BPF prog-id=219 op=LOAD Jan 15 02:01:40.217000 audit[4415]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4389 pid=4415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.217000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339656464643835303138396430323634616232613432623130303631 Jan 15 02:01:40.218000 audit: BPF prog-id=219 op=UNLOAD Jan 15 02:01:40.218000 audit[4415]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4389 pid=4415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.218000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339656464643835303138396430323634616232613432623130303631 Jan 15 02:01:40.218000 audit: BPF prog-id=218 op=UNLOAD Jan 15 02:01:40.218000 audit[4415]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4389 pid=4415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.218000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339656464643835303138396430323634616232613432623130303631 Jan 15 02:01:40.218000 audit: BPF prog-id=220 op=LOAD Jan 15 02:01:40.218000 audit[4415]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4389 pid=4415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.218000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339656464643835303138396430323634616232613432623130303631 Jan 15 02:01:40.290639 containerd[1710]: time="2026-01-15T02:01:40.290395207Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ljpnb,Uid:bf5cf22e-5674-4890-9d47-13e5c96cd4be,Namespace:kube-system,Attempt:0,} returns sandbox id \"c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a\"" Jan 15 02:01:40.300455 systemd-networkd[1600]: calie9812df1f30: Link UP Jan 15 02:01:40.302220 containerd[1710]: time="2026-01-15T02:01:40.302187633Z" level=info msg="CreateContainer within sandbox \"c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 15 02:01:40.305039 systemd-networkd[1600]: calie9812df1f30: Gained carrier Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.199 [INFO][4374] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--hpxzl-eth0 coredns-668d6bf9bc- kube-system 2ea7b673-cd9f-4a24-98cf-dc76e6ed54b4 843 0 2026-01-15 02:00:50 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4515-1-0-n-e5e35ee394 coredns-668d6bf9bc-hpxzl eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie9812df1f30 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hpxzl" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--hpxzl-" Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.200 [INFO][4374] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hpxzl" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--hpxzl-eth0" Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.238 [INFO][4445] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" 
HandleID="k8s-pod-network.694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" Workload="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--hpxzl-eth0" Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.238 [INFO][4445] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" HandleID="k8s-pod-network.694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" Workload="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--hpxzl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4fe0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4515-1-0-n-e5e35ee394", "pod":"coredns-668d6bf9bc-hpxzl", "timestamp":"2026-01-15 02:01:40.238402236 +0000 UTC"}, Hostname:"ci-4515-1-0-n-e5e35ee394", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.238 [INFO][4445] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.238 [INFO][4445] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.238 [INFO][4445] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-e5e35ee394' Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.245 [INFO][4445] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.253 [INFO][4445] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.262 [INFO][4445] ipam/ipam.go 511: Trying affinity for 192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.266 [INFO][4445] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.276 [INFO][4445] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.276 [INFO][4445] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.64/26 handle="k8s-pod-network.694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.278 [INFO][4445] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.283 [INFO][4445] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.64/26 handle="k8s-pod-network.694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.290 [INFO][4445] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.92.68/26] block=192.168.92.64/26 handle="k8s-pod-network.694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.290 
[INFO][4445] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.68/26] handle="k8s-pod-network.694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.290 [INFO][4445] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 15 02:01:40.321223 containerd[1710]: 2026-01-15 02:01:40.290 [INFO][4445] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.68/26] IPv6=[] ContainerID="694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" HandleID="k8s-pod-network.694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" Workload="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--hpxzl-eth0" Jan 15 02:01:40.321705 containerd[1710]: 2026-01-15 02:01:40.295 [INFO][4374] cni-plugin/k8s.go 418: Populated endpoint ContainerID="694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hpxzl" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--hpxzl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--hpxzl-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"2ea7b673-cd9f-4a24-98cf-dc76e6ed54b4", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 0, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"", Pod:"coredns-668d6bf9bc-hpxzl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie9812df1f30", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:40.321705 containerd[1710]: 2026-01-15 02:01:40.296 [INFO][4374] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.68/32] ContainerID="694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hpxzl" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--hpxzl-eth0" Jan 15 02:01:40.321705 containerd[1710]: 2026-01-15 02:01:40.296 [INFO][4374] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie9812df1f30 ContainerID="694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hpxzl" 
WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--hpxzl-eth0" Jan 15 02:01:40.321705 containerd[1710]: 2026-01-15 02:01:40.303 [INFO][4374] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hpxzl" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--hpxzl-eth0" Jan 15 02:01:40.321705 containerd[1710]: 2026-01-15 02:01:40.305 [INFO][4374] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hpxzl" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--hpxzl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--hpxzl-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"2ea7b673-cd9f-4a24-98cf-dc76e6ed54b4", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 0, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d", Pod:"coredns-668d6bf9bc-hpxzl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie9812df1f30", MAC:"b6:92:19:87:41:b4", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:40.321705 containerd[1710]: 2026-01-15 02:01:40.318 [INFO][4374] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" Namespace="kube-system" Pod="coredns-668d6bf9bc-hpxzl" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-coredns--668d6bf9bc--hpxzl-eth0" Jan 15 02:01:40.324495 containerd[1710]: time="2026-01-15T02:01:40.324357804Z" level=info msg="Container 21d8a13139a1d82d00029c717f5ff2d46abd150d0bab3b77c3d4f000ada8d58c: CDI devices from CRI Config.CDIDevices: []" Jan 15 02:01:40.336406 containerd[1710]: time="2026-01-15T02:01:40.336241454Z" level=info msg="CreateContainer within sandbox \"c9eddd850189d0264ab2a42b10061c350442d9676c39352f3a1fdd1c1e88903a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id 
\"21d8a13139a1d82d00029c717f5ff2d46abd150d0bab3b77c3d4f000ada8d58c\"" Jan 15 02:01:40.337785 containerd[1710]: time="2026-01-15T02:01:40.337759877Z" level=info msg="StartContainer for \"21d8a13139a1d82d00029c717f5ff2d46abd150d0bab3b77c3d4f000ada8d58c\"" Jan 15 02:01:40.339232 containerd[1710]: time="2026-01-15T02:01:40.339208152Z" level=info msg="connecting to shim 21d8a13139a1d82d00029c717f5ff2d46abd150d0bab3b77c3d4f000ada8d58c" address="unix:///run/containerd/s/6ccf2375f9a4acae614a9a94c92eb16f21b8ff8ebc148d0dfcf7f2868a7870be" protocol=ttrpc version=3 Jan 15 02:01:40.346000 audit[4477]: NETFILTER_CFG table=filter:123 family=2 entries=40 op=nft_register_chain pid=4477 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 02:01:40.346000 audit[4477]: SYSCALL arch=c000003e syscall=46 success=yes exit=20344 a0=3 a1=7ffc801fc110 a2=0 a3=7ffc801fc0fc items=0 ppid=4186 pid=4477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.346000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 02:01:40.380182 containerd[1710]: time="2026-01-15T02:01:40.378662395Z" level=info msg="connecting to shim 694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d" address="unix:///run/containerd/s/1e23991cd1f09b974652fcb020fca9ab0c0cec464b156bcf4d992fcf8af9c902" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:01:40.385877 systemd[1]: Started cri-containerd-21d8a13139a1d82d00029c717f5ff2d46abd150d0bab3b77c3d4f000ada8d58c.scope - libcontainer container 21d8a13139a1d82d00029c717f5ff2d46abd150d0bab3b77c3d4f000ada8d58c. 
Jan 15 02:01:40.409803 systemd-networkd[1600]: cali844445d8607: Link UP Jan 15 02:01:40.413291 containerd[1710]: time="2026-01-15T02:01:40.413264720Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:01:40.414127 systemd-networkd[1600]: cali844445d8607: Gained carrier Jan 15 02:01:40.415491 containerd[1710]: time="2026-01-15T02:01:40.415459381Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 02:01:40.416000 audit: BPF prog-id=221 op=LOAD Jan 15 02:01:40.417807 containerd[1710]: time="2026-01-15T02:01:40.415529631Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:40.417807 containerd[1710]: time="2026-01-15T02:01:40.417290338Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 02:01:40.417868 kubelet[2935]: E0115 02:01:40.415679 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 02:01:40.417868 kubelet[2935]: E0115 02:01:40.415719 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 02:01:40.417951 kubelet[2935]: E0115 02:01:40.415898 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-swvrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cd9b9db69-tblzt_calico-system(0839e1a2-38f7-4739-be41-8f605565d9d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 02:01:40.418807 kubelet[2935]: E0115 02:01:40.418725 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:01:40.418000 audit: BPF prog-id=222 op=LOAD Jan 15 02:01:40.418000 audit[4478]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106238 a2=98 a3=0 items=0 ppid=4389 pid=4478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.418000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231643861313331333961316438326430303032396337313766356666 Jan 15 02:01:40.418000 audit: BPF prog-id=222 op=UNLOAD Jan 15 02:01:40.418000 audit[4478]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4389 pid=4478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231643861313331333961316438326430303032396337313766356666 Jan 15 02:01:40.418000 audit: BPF prog-id=223 op=LOAD Jan 15 02:01:40.418000 audit[4478]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000106488 a2=98 a3=0 items=0 ppid=4389 pid=4478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231643861313331333961316438326430303032396337313766356666 Jan 15 02:01:40.419000 audit: BPF prog-id=224 op=LOAD Jan 15 02:01:40.419000 audit[4478]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000106218 a2=98 a3=0 items=0 ppid=4389 pid=4478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.419000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231643861313331333961316438326430303032396337313766356666 Jan 15 02:01:40.419000 audit: BPF prog-id=224 op=UNLOAD Jan 15 02:01:40.419000 audit[4478]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4389 pid=4478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.419000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231643861313331333961316438326430303032396337313766356666 Jan 15 02:01:40.419000 audit: BPF prog-id=223 op=UNLOAD Jan 15 02:01:40.419000 audit[4478]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4389 pid=4478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.419000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231643861313331333961316438326430303032396337313766356666 Jan 15 02:01:40.419000 audit: BPF prog-id=225 op=LOAD Jan 15 02:01:40.419000 audit[4478]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001066e8 a2=98 a3=0 items=0 ppid=4389 pid=4478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.419000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231643861313331333961316438326430303032396337313766356666 Jan 15 02:01:40.426827 systemd[1]: Started cri-containerd-694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d.scope - libcontainer container 694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d. Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.191 [INFO][4365] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--e5e35ee394-k8s-csi--node--driver--hdqqp-eth0 csi-node-driver- calico-system 328d556f-d445-4769-97a7-1a5530a232c4 732 0 2026-01-15 02:01:13 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4515-1-0-n-e5e35ee394 csi-node-driver-hdqqp eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali844445d8607 [] [] }} ContainerID="0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" Namespace="calico-system" Pod="csi-node-driver-hdqqp" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-csi--node--driver--hdqqp-" Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.191 [INFO][4365] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" Namespace="calico-system" Pod="csi-node-driver-hdqqp" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-csi--node--driver--hdqqp-eth0" Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.265 [INFO][4439] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" HandleID="k8s-pod-network.0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" Workload="ci--4515--1--0--n--e5e35ee394-k8s-csi--node--driver--hdqqp-eth0" Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.265 [INFO][4439] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" HandleID="k8s-pod-network.0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" Workload="ci--4515--1--0--n--e5e35ee394-k8s-csi--node--driver--hdqqp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00025bba0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-n-e5e35ee394", "pod":"csi-node-driver-hdqqp", "timestamp":"2026-01-15 02:01:40.265097712 +0000 UTC"}, 
Hostname:"ci-4515-1-0-n-e5e35ee394", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.265 [INFO][4439] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.290 [INFO][4439] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.290 [INFO][4439] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-e5e35ee394' Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.345 [INFO][4439] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.353 [INFO][4439] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.362 [INFO][4439] ipam/ipam.go 511: Trying affinity for 192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.371 [INFO][4439] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.373 [INFO][4439] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.373 [INFO][4439] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.64/26 handle="k8s-pod-network.0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.375 [INFO][4439] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0 Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.385 [INFO][4439] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.64/26 handle="k8s-pod-network.0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.391 [INFO][4439] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.92.69/26] block=192.168.92.64/26 handle="k8s-pod-network.0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.391 [INFO][4439] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.69/26] handle="k8s-pod-network.0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.391 [INFO][4439] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 15 02:01:40.453733 containerd[1710]: 2026-01-15 02:01:40.391 [INFO][4439] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.69/26] IPv6=[] ContainerID="0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" HandleID="k8s-pod-network.0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" Workload="ci--4515--1--0--n--e5e35ee394-k8s-csi--node--driver--hdqqp-eth0" Jan 15 02:01:40.456873 containerd[1710]: 2026-01-15 02:01:40.395 [INFO][4365] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" Namespace="calico-system" Pod="csi-node-driver-hdqqp" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-csi--node--driver--hdqqp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-csi--node--driver--hdqqp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"328d556f-d445-4769-97a7-1a5530a232c4", ResourceVersion:"732", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 1, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"", Pod:"csi-node-driver-hdqqp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.92.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali844445d8607", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:40.456873 containerd[1710]: 2026-01-15 02:01:40.395 [INFO][4365] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.69/32] ContainerID="0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" Namespace="calico-system" Pod="csi-node-driver-hdqqp" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-csi--node--driver--hdqqp-eth0" Jan 15 02:01:40.456873 containerd[1710]: 2026-01-15 02:01:40.395 [INFO][4365] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali844445d8607 ContainerID="0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" Namespace="calico-system" Pod="csi-node-driver-hdqqp" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-csi--node--driver--hdqqp-eth0" Jan 15 02:01:40.456873 containerd[1710]: 2026-01-15 02:01:40.415 [INFO][4365] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" Namespace="calico-system" Pod="csi-node-driver-hdqqp" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-csi--node--driver--hdqqp-eth0" Jan 15 02:01:40.456873 containerd[1710]: 2026-01-15 02:01:40.419 [INFO][4365] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" Namespace="calico-system" Pod="csi-node-driver-hdqqp" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-csi--node--driver--hdqqp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-csi--node--driver--hdqqp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"328d556f-d445-4769-97a7-1a5530a232c4", ResourceVersion:"732", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 1, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0", Pod:"csi-node-driver-hdqqp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.92.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali844445d8607", MAC:"5a:db:ad:4c:f2:70", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:40.456873 containerd[1710]: 2026-01-15 02:01:40.449 [INFO][4365] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" Namespace="calico-system" Pod="csi-node-driver-hdqqp" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-csi--node--driver--hdqqp-eth0" Jan 15 02:01:40.459000 audit: BPF prog-id=226 op=LOAD Jan 15 02:01:40.460000 audit: BPF prog-id=227 op=LOAD Jan 15 02:01:40.460000 audit[4508]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c238 a2=98 a3=0 items=0 ppid=4497 pid=4508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639346231643331316237396330393266306333323563333666313766 Jan 15 02:01:40.460000 audit: BPF prog-id=227 op=UNLOAD Jan 15 02:01:40.460000 audit[4508]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4497 pid=4508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.460000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639346231643331316237396330393266306333323563333666313766 Jan 15 02:01:40.460000 audit: BPF prog-id=228 op=LOAD Jan 15 02:01:40.460000 audit[4508]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c488 a2=98 a3=0 items=0 ppid=4497 pid=4508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639346231643331316237396330393266306333323563333666313766 Jan 15 02:01:40.460000 audit: BPF prog-id=229 op=LOAD Jan 15 02:01:40.460000 audit[4508]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c00010c218 a2=98 a3=0 items=0 ppid=4497 pid=4508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639346231643331316237396330393266306333323563333666313766 Jan 15 02:01:40.460000 audit: BPF prog-id=229 op=UNLOAD Jan 15 02:01:40.460000 audit[4508]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4497 pid=4508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639346231643331316237396330393266306333323563333666313766 Jan 15 02:01:40.460000 audit: BPF prog-id=228 op=UNLOAD Jan 15 02:01:40.460000 audit[4508]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4497 pid=4508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639346231643331316237396330393266306333323563333666313766 Jan 15 02:01:40.460000 audit: BPF prog-id=230 op=LOAD Jan 15 02:01:40.460000 audit[4508]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c00010c6e8 a2=98 a3=0 items=0 ppid=4497 pid=4508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.460000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639346231643331316237396330393266306333323563333666313766 Jan 15 02:01:40.462729 containerd[1710]: time="2026-01-15T02:01:40.461059322Z" level=info msg="StartContainer for \"21d8a13139a1d82d00029c717f5ff2d46abd150d0bab3b77c3d4f000ada8d58c\" returns successfully" Jan 15 02:01:40.480000 audit[4551]: NETFILTER_CFG table=filter:124 family=2 entries=48 op=nft_register_chain pid=4551 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 02:01:40.480000 audit[4551]: SYSCALL arch=c000003e syscall=46 success=yes exit=23140 a0=3 a1=7ffc33b8d390 a2=0 a3=7ffc33b8d37c items=0 ppid=4186 pid=4551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.480000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 02:01:40.495409 containerd[1710]: time="2026-01-15T02:01:40.495288230Z" level=info msg="connecting to shim 0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0" address="unix:///run/containerd/s/d2eaf746b315a11ca449a14c6a3d80dd491a0df2004a13583e3935da4e9c65c5" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:01:40.501353 systemd-networkd[1600]: cali2b2fc191187: Link UP Jan 15 02:01:40.501483 systemd-networkd[1600]: cali2b2fc191187: Gained carrier Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.195 [INFO][4381] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--x7ppt-eth0 calico-apiserver-5dcd9489f8- calico-apiserver 768894fc-e95b-49e5-9a90-1487b94ce02a 851 0 2026-01-15 02:01:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5dcd9489f8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4515-1-0-n-e5e35ee394 calico-apiserver-5dcd9489f8-x7ppt eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2b2fc191187 [] [] }} ContainerID="88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-x7ppt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--x7ppt-" Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.195 [INFO][4381] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-x7ppt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--x7ppt-eth0" Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.281 [INFO][4450] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" HandleID="k8s-pod-network.88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" Workload="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--x7ppt-eth0" Jan 15 02:01:40.530346 
containerd[1710]: 2026-01-15 02:01:40.282 [INFO][4450] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" HandleID="k8s-pod-network.88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" Workload="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--x7ppt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000323380), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4515-1-0-n-e5e35ee394", "pod":"calico-apiserver-5dcd9489f8-x7ppt", "timestamp":"2026-01-15 02:01:40.281750862 +0000 UTC"}, Hostname:"ci-4515-1-0-n-e5e35ee394", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.282 [INFO][4450] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.392 [INFO][4450] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.393 [INFO][4450] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-e5e35ee394' Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.448 [INFO][4450] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.458 [INFO][4450] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.465 [INFO][4450] ipam/ipam.go 511: Trying affinity for 192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.468 [INFO][4450] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.471 [INFO][4450] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.474 [INFO][4450] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.64/26 handle="k8s-pod-network.88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.475 [INFO][4450] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.482 [INFO][4450] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.64/26 handle="k8s-pod-network.88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.493 [INFO][4450] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.92.70/26] block=192.168.92.64/26 handle="k8s-pod-network.88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.493 [INFO][4450] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.70/26] handle="k8s-pod-network.88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" 
host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.494 [INFO][4450] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 15 02:01:40.530346 containerd[1710]: 2026-01-15 02:01:40.494 [INFO][4450] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.70/26] IPv6=[] ContainerID="88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" HandleID="k8s-pod-network.88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" Workload="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--x7ppt-eth0" Jan 15 02:01:40.530843 containerd[1710]: 2026-01-15 02:01:40.498 [INFO][4381] cni-plugin/k8s.go 418: Populated endpoint ContainerID="88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-x7ppt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--x7ppt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--x7ppt-eth0", GenerateName:"calico-apiserver-5dcd9489f8-", Namespace:"calico-apiserver", SelfLink:"", UID:"768894fc-e95b-49e5-9a90-1487b94ce02a", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 1, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5dcd9489f8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"", Pod:"calico-apiserver-5dcd9489f8-x7ppt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.92.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2b2fc191187", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:40.530843 containerd[1710]: 2026-01-15 02:01:40.498 [INFO][4381] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.70/32] ContainerID="88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-x7ppt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--x7ppt-eth0" Jan 15 02:01:40.530843 containerd[1710]: 2026-01-15 02:01:40.498 [INFO][4381] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2b2fc191187 ContainerID="88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-x7ppt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--x7ppt-eth0" Jan 15 02:01:40.530843 containerd[1710]: 2026-01-15 02:01:40.501 [INFO][4381] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" Namespace="calico-apiserver" 
Pod="calico-apiserver-5dcd9489f8-x7ppt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--x7ppt-eth0" Jan 15 02:01:40.530843 containerd[1710]: 2026-01-15 02:01:40.502 [INFO][4381] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-x7ppt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--x7ppt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--x7ppt-eth0", GenerateName:"calico-apiserver-5dcd9489f8-", Namespace:"calico-apiserver", SelfLink:"", UID:"768894fc-e95b-49e5-9a90-1487b94ce02a", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 1, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5dcd9489f8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b", Pod:"calico-apiserver-5dcd9489f8-x7ppt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.92.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2b2fc191187", MAC:"26:9e:7c:4d:e2:0e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:40.530843 containerd[1710]: 2026-01-15 02:01:40.524 [INFO][4381] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-x7ppt" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--x7ppt-eth0" Jan 15 02:01:40.539561 systemd[1]: Started cri-containerd-0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0.scope - libcontainer container 0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0. 
Jan 15 02:01:40.555000 audit: BPF prog-id=231 op=LOAD Jan 15 02:01:40.556000 audit: BPF prog-id=232 op=LOAD Jan 15 02:01:40.556000 audit[4578]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000220238 a2=98 a3=0 items=0 ppid=4564 pid=4578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.556000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034313534303062623836313232356365313031656635393735646639 Jan 15 02:01:40.556000 audit: BPF prog-id=232 op=UNLOAD Jan 15 02:01:40.556000 audit[4578]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4564 pid=4578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.556000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034313534303062623836313232356365313031656635393735646639 Jan 15 02:01:40.557000 audit: BPF prog-id=233 op=LOAD Jan 15 02:01:40.557000 audit[4578]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000220488 a2=98 a3=0 items=0 ppid=4564 pid=4578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.557000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034313534303062623836313232356365313031656635393735646639 Jan 15 02:01:40.557000 audit: BPF prog-id=234 op=LOAD Jan 15 02:01:40.557000 audit[4578]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000220218 a2=98 a3=0 items=0 ppid=4564 pid=4578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.557000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034313534303062623836313232356365313031656635393735646639 Jan 15 02:01:40.557000 audit: BPF prog-id=234 op=UNLOAD Jan 15 02:01:40.557000 audit[4578]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4564 pid=4578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.557000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034313534303062623836313232356365313031656635393735646639 Jan 15 02:01:40.557000 audit: BPF prog-id=233 op=UNLOAD Jan 15 02:01:40.557000 audit[4578]: SYSCALL 
arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4564 pid=4578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.557000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034313534303062623836313232356365313031656635393735646639 Jan 15 02:01:40.557000 audit: BPF prog-id=235 op=LOAD Jan 15 02:01:40.557000 audit[4578]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0002206e8 a2=98 a3=0 items=0 ppid=4564 pid=4578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.557000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3034313534303062623836313232356365313031656635393735646639 Jan 15 02:01:40.558241 containerd[1710]: time="2026-01-15T02:01:40.558189609Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hpxzl,Uid:2ea7b673-cd9f-4a24-98cf-dc76e6ed54b4,Namespace:kube-system,Attempt:0,} returns sandbox id \"694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d\"" Jan 15 02:01:40.562216 containerd[1710]: time="2026-01-15T02:01:40.561820580Z" level=info msg="CreateContainer within sandbox \"694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 15 02:01:40.572000 audit[4607]: NETFILTER_CFG table=filter:125 family=2 entries=72 op=nft_register_chain pid=4607 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 02:01:40.572000 audit[4607]: SYSCALL arch=c000003e syscall=46 success=yes exit=35812 a0=3 a1=7ffdcd5b5b20 a2=0 a3=7ffdcd5b5b0c items=0 ppid=4186 pid=4607 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.572000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 02:01:40.576936 containerd[1710]: time="2026-01-15T02:01:40.576874558Z" level=info msg="Container 3854e8412970b56092c33e5923669243c5e62f7fab95ed541e0b550d3d72c52a: CDI devices from CRI Config.CDIDevices: []" Jan 15 02:01:40.583622 containerd[1710]: time="2026-01-15T02:01:40.583586098Z" level=info msg="connecting to shim 88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b" address="unix:///run/containerd/s/2f1e0bb74c77de1960070eb39e596ebacb3581dea2a47e6a7eb3b6eb16122878" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:01:40.596883 containerd[1710]: time="2026-01-15T02:01:40.596834049Z" level=info msg="CreateContainer within sandbox \"694b1d311b79c092f0c325c36f17fd4e2066a6d8d0316a8f363ac32a9b85ff6d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3854e8412970b56092c33e5923669243c5e62f7fab95ed541e0b550d3d72c52a\"" Jan 15 02:01:40.599053 containerd[1710]: time="2026-01-15T02:01:40.598641793Z" level=info msg="StartContainer 
for \"3854e8412970b56092c33e5923669243c5e62f7fab95ed541e0b550d3d72c52a\"" Jan 15 02:01:40.600945 containerd[1710]: time="2026-01-15T02:01:40.600921628Z" level=info msg="connecting to shim 3854e8412970b56092c33e5923669243c5e62f7fab95ed541e0b550d3d72c52a" address="unix:///run/containerd/s/1e23991cd1f09b974652fcb020fca9ab0c0cec464b156bcf4d992fcf8af9c902" protocol=ttrpc version=3 Jan 15 02:01:40.601916 containerd[1710]: time="2026-01-15T02:01:40.601118884Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hdqqp,Uid:328d556f-d445-4769-97a7-1a5530a232c4,Namespace:calico-system,Attempt:0,} returns sandbox id \"0415400bb861225ce101ef5975df9981d8f56dd857463341d45f736e57519fd0\"" Jan 15 02:01:40.625336 systemd[1]: Started cri-containerd-88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b.scope - libcontainer container 88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b. Jan 15 02:01:40.629300 systemd[1]: Started cri-containerd-3854e8412970b56092c33e5923669243c5e62f7fab95ed541e0b550d3d72c52a.scope - libcontainer container 3854e8412970b56092c33e5923669243c5e62f7fab95ed541e0b550d3d72c52a. Jan 15 02:01:40.641000 audit: BPF prog-id=236 op=LOAD Jan 15 02:01:40.641000 audit: BPF prog-id=237 op=LOAD Jan 15 02:01:40.641000 audit[4637]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000128238 a2=98 a3=0 items=0 ppid=4497 pid=4637 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.641000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338353465383431323937306235363039326333336535393233363639 Jan 15 02:01:40.641000 audit: BPF prog-id=237 op=UNLOAD Jan 15 02:01:40.641000 audit[4637]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4497 pid=4637 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.641000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338353465383431323937306235363039326333336535393233363639 Jan 15 02:01:40.641000 audit: BPF prog-id=238 op=LOAD Jan 15 02:01:40.641000 audit[4637]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000128488 a2=98 a3=0 items=0 ppid=4497 pid=4637 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.641000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338353465383431323937306235363039326333336535393233363639 Jan 15 02:01:40.641000 audit: BPF prog-id=239 op=LOAD Jan 15 02:01:40.641000 audit[4637]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000128218 a2=98 a3=0 items=0 ppid=4497 pid=4637 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.641000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338353465383431323937306235363039326333336535393233363639 Jan 15 02:01:40.642000 audit: BPF prog-id=239 op=UNLOAD Jan 15 02:01:40.642000 audit[4637]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4497 pid=4637 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.642000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338353465383431323937306235363039326333336535393233363639 Jan 15 02:01:40.642000 audit: BPF prog-id=238 op=UNLOAD Jan 15 02:01:40.642000 audit[4637]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4497 pid=4637 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.642000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338353465383431323937306235363039326333336535393233363639 Jan 15 02:01:40.642000 audit: BPF prog-id=240 op=LOAD Jan 15 02:01:40.642000 audit[4637]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001286e8 a2=98 a3=0 items=0 ppid=4497 pid=4637 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.642000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338353465383431323937306235363039326333336535393233363639 Jan 15 02:01:40.645000 audit: BPF prog-id=241 op=LOAD Jan 15 02:01:40.646000 audit: BPF prog-id=242 op=LOAD Jan 15 02:01:40.646000 audit[4634]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=4618 pid=4634 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.646000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838616564623032316139356132346464386236633134623366616366 Jan 15 02:01:40.646000 audit: BPF prog-id=242 op=UNLOAD Jan 15 02:01:40.646000 audit[4634]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4618 pid=4634 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.646000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838616564623032316139356132346464386236633134623366616366 Jan 15 02:01:40.646000 audit: BPF prog-id=243 op=LOAD Jan 15 02:01:40.646000 audit[4634]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=4618 pid=4634 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.646000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838616564623032316139356132346464386236633134623366616366 Jan 15 02:01:40.646000 audit: BPF prog-id=244 op=LOAD Jan 15 02:01:40.646000 audit[4634]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=4618 pid=4634 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.646000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838616564623032316139356132346464386236633134623366616366 Jan 15 02:01:40.646000 audit: BPF prog-id=244 op=UNLOAD Jan 15 02:01:40.646000 audit[4634]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4618 pid=4634 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.646000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838616564623032316139356132346464386236633134623366616366 Jan 15 02:01:40.646000 audit: BPF prog-id=243 op=UNLOAD Jan 15 02:01:40.646000 audit[4634]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4618 pid=4634 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.646000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838616564623032316139356132346464386236633134623366616366 Jan 15 02:01:40.646000 audit: BPF prog-id=245 op=LOAD Jan 15 02:01:40.646000 audit[4634]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=4618 pid=4634 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:40.646000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3838616564623032316139356132346464386236633134623366616366 Jan 15 02:01:40.665145 containerd[1710]: time="2026-01-15T02:01:40.665118802Z" level=info msg="StartContainer for \"3854e8412970b56092c33e5923669243c5e62f7fab95ed541e0b550d3d72c52a\" returns successfully" Jan 15 02:01:40.718179 containerd[1710]: time="2026-01-15T02:01:40.717293670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcd9489f8-x7ppt,Uid:768894fc-e95b-49e5-9a90-1487b94ce02a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"88aedb021a95a24dd8b6c14b3facfc73fe90891dca6c7d7b1898d9e542ab384b\"" Jan 15 02:01:40.740320 systemd-networkd[1600]: vxlan.calico: Gained IPv6LL Jan 15 02:01:40.757419 containerd[1710]: time="2026-01-15T02:01:40.757318205Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:01:40.761495 containerd[1710]: time="2026-01-15T02:01:40.761472988Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 02:01:40.761613 containerd[1710]: time="2026-01-15T02:01:40.761492195Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:40.761757 kubelet[2935]: E0115 02:01:40.761734 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 02:01:40.762610 kubelet[2935]: E0115 02:01:40.761988 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 02:01:40.762610 kubelet[2935]: E0115 02:01:40.762176 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2xk5t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b978c6cc9-jnxw2_calico-system(85d7699e-6a0a-4fd1-96b0-9365b90d23ad): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 02:01:40.762772 containerd[1710]: time="2026-01-15T02:01:40.762712807Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 02:01:40.764114 kubelet[2935]: E0115 02:01:40.764090 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:01:41.012975 kubelet[2935]: E0115 02:01:41.012432 2935 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:01:41.017180 kubelet[2935]: E0115 02:01:41.015779 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:01:41.027187 kubelet[2935]: I0115 02:01:41.026919 2935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-ljpnb" podStartSLOduration=51.026865986 podStartE2EDuration="51.026865986s" podCreationTimestamp="2026-01-15 02:00:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 02:01:41.02500928 +0000 UTC m=+57.403001223" watchObservedRunningTime="2026-01-15 02:01:41.026865986 +0000 UTC m=+57.404857915" Jan 15 02:01:41.084000 audit[4700]: NETFILTER_CFG table=filter:126 family=2 entries=20 op=nft_register_rule pid=4700 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:41.084000 audit[4700]: SYSCALL arch=c000003e syscall=46 success=yes exit=7480 a0=3 a1=7ffc27614680 a2=0 a3=7ffc2761466c items=0 ppid=3037 pid=4700 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.084000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:41.091000 audit[4700]: NETFILTER_CFG table=nat:127 family=2 entries=14 op=nft_register_rule pid=4700 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:41.091000 audit[4700]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffc27614680 a2=0 a3=0 items=0 ppid=3037 pid=4700 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.091000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:41.103401 containerd[1710]: time="2026-01-15T02:01:41.103363263Z" level=info msg="fetch failed after status: 404 Not 
Found" host=ghcr.io Jan 15 02:01:41.105407 containerd[1710]: time="2026-01-15T02:01:41.105371563Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 02:01:41.105510 containerd[1710]: time="2026-01-15T02:01:41.105454219Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:41.105814 kubelet[2935]: E0115 02:01:41.105682 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 02:01:41.105874 kubelet[2935]: E0115 02:01:41.105824 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 02:01:41.108209 kubelet[2935]: E0115 02:01:41.106330 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ktgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hdqqp_calico-system(328d556f-d445-4769-97a7-1a5530a232c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 02:01:41.108348 containerd[1710]: time="2026-01-15T02:01:41.108268862Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 02:01:41.115000 audit[4702]: NETFILTER_CFG table=filter:128 family=2 entries=17 op=nft_register_rule pid=4702 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:41.115000 audit[4702]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffe02ce98e0 a2=0 a3=7ffe02ce98cc items=0 ppid=3037 pid=4702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.115000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:41.118000 audit[4702]: NETFILTER_CFG table=nat:129 family=2 entries=35 op=nft_register_chain pid=4702 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:41.118000 audit[4702]: SYSCALL arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7ffe02ce98e0 a2=0 a3=7ffe02ce98cc items=0 ppid=3037 pid=4702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.118000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:41.455937 containerd[1710]: time="2026-01-15T02:01:41.455841245Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:01:41.459188 containerd[1710]: time="2026-01-15T02:01:41.459100562Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 02:01:41.459461 containerd[1710]: time="2026-01-15T02:01:41.459340586Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:41.460285 kubelet[2935]: E0115 02:01:41.459819 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:01:41.460285 kubelet[2935]: E0115 02:01:41.459924 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:01:41.461552 kubelet[2935]: E0115 02:01:41.460802 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzplf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5dcd9489f8-x7ppt_calico-apiserver(768894fc-e95b-49e5-9a90-1487b94ce02a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 02:01:41.463379 containerd[1710]: time="2026-01-15T02:01:41.461893083Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 02:01:41.463511 kubelet[2935]: E0115 02:01:41.463284 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:01:41.637662 systemd-networkd[1600]: cali2b2fc191187: Gained IPv6LL Jan 15 02:01:41.740911 containerd[1710]: time="2026-01-15T02:01:41.740237025Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-5x8dj,Uid:7529b612-ddf2-4fb7-9823-720a3bd71760,Namespace:calico-system,Attempt:0,}" Jan 15 02:01:41.764815 systemd-networkd[1600]: cali844445d8607: Gained IPv6LL Jan 15 02:01:41.808061 containerd[1710]: time="2026-01-15T02:01:41.808026456Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:01:41.811030 containerd[1710]: time="2026-01-15T02:01:41.810893580Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" 
error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 02:01:41.811030 containerd[1710]: time="2026-01-15T02:01:41.810972745Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:41.811525 kubelet[2935]: E0115 02:01:41.811330 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 02:01:41.811525 kubelet[2935]: E0115 02:01:41.811375 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 02:01:41.812303 kubelet[2935]: E0115 02:01:41.811492 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ktgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hdqqp_calico-system(328d556f-d445-4769-97a7-1a5530a232c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 
02:01:41.814788 kubelet[2935]: E0115 02:01:41.813938 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:01:41.863129 systemd-networkd[1600]: calic624c3923f9: Link UP Jan 15 02:01:41.864180 systemd-networkd[1600]: calic624c3923f9: Gained carrier Jan 15 02:01:41.875454 kubelet[2935]: I0115 02:01:41.875328 2935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-hpxzl" podStartSLOduration=51.875246345 podStartE2EDuration="51.875246345s" podCreationTimestamp="2026-01-15 02:00:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-15 02:01:41.124400891 +0000 UTC m=+57.502392752" watchObservedRunningTime="2026-01-15 02:01:41.875246345 +0000 UTC m=+58.253238189" Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.790 [INFO][4704] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--e5e35ee394-k8s-goldmane--666569f655--5x8dj-eth0 goldmane-666569f655- calico-system 7529b612-ddf2-4fb7-9823-720a3bd71760 853 0 2026-01-15 02:01:11 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4515-1-0-n-e5e35ee394 goldmane-666569f655-5x8dj eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calic624c3923f9 [] [] }} ContainerID="71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" Namespace="calico-system" Pod="goldmane-666569f655-5x8dj" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-goldmane--666569f655--5x8dj-" Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.790 [INFO][4704] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" Namespace="calico-system" Pod="goldmane-666569f655-5x8dj" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-goldmane--666569f655--5x8dj-eth0" Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.825 [INFO][4716] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" HandleID="k8s-pod-network.71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" Workload="ci--4515--1--0--n--e5e35ee394-k8s-goldmane--666569f655--5x8dj-eth0" Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.825 [INFO][4716] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" HandleID="k8s-pod-network.71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" Workload="ci--4515--1--0--n--e5e35ee394-k8s-goldmane--666569f655--5x8dj-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f0f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-n-e5e35ee394", "pod":"goldmane-666569f655-5x8dj", "timestamp":"2026-01-15 02:01:41.825305553 +0000 UTC"}, Hostname:"ci-4515-1-0-n-e5e35ee394", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.825 [INFO][4716] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.825 [INFO][4716] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.825 [INFO][4716] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-e5e35ee394' Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.834 [INFO][4716] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.838 [INFO][4716] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.841 [INFO][4716] ipam/ipam.go 511: Trying affinity for 192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.843 [INFO][4716] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.845 [INFO][4716] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.845 [INFO][4716] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.64/26 handle="k8s-pod-network.71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.846 [INFO][4716] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0 Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.851 [INFO][4716] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.64/26 handle="k8s-pod-network.71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.857 [INFO][4716] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.92.71/26] block=192.168.92.64/26 handle="k8s-pod-network.71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.857 [INFO][4716] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.71/26] handle="k8s-pod-network.71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.858 [INFO][4716] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 15 02:01:41.878984 containerd[1710]: 2026-01-15 02:01:41.858 [INFO][4716] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.71/26] IPv6=[] ContainerID="71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" HandleID="k8s-pod-network.71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" Workload="ci--4515--1--0--n--e5e35ee394-k8s-goldmane--666569f655--5x8dj-eth0" Jan 15 02:01:41.879993 containerd[1710]: 2026-01-15 02:01:41.859 [INFO][4704] cni-plugin/k8s.go 418: Populated endpoint ContainerID="71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" Namespace="calico-system" Pod="goldmane-666569f655-5x8dj" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-goldmane--666569f655--5x8dj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-goldmane--666569f655--5x8dj-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"7529b612-ddf2-4fb7-9823-720a3bd71760", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 1, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"", Pod:"goldmane-666569f655-5x8dj", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.92.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic624c3923f9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:41.879993 containerd[1710]: 2026-01-15 02:01:41.859 [INFO][4704] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.71/32] ContainerID="71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" Namespace="calico-system" Pod="goldmane-666569f655-5x8dj" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-goldmane--666569f655--5x8dj-eth0" Jan 15 02:01:41.879993 containerd[1710]: 2026-01-15 02:01:41.859 [INFO][4704] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic624c3923f9 ContainerID="71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" Namespace="calico-system" Pod="goldmane-666569f655-5x8dj" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-goldmane--666569f655--5x8dj-eth0" Jan 15 02:01:41.879993 containerd[1710]: 2026-01-15 02:01:41.864 [INFO][4704] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" Namespace="calico-system" Pod="goldmane-666569f655-5x8dj" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-goldmane--666569f655--5x8dj-eth0" Jan 15 02:01:41.879993 containerd[1710]: 2026-01-15 02:01:41.864 [INFO][4704] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" 
Namespace="calico-system" Pod="goldmane-666569f655-5x8dj" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-goldmane--666569f655--5x8dj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-goldmane--666569f655--5x8dj-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"7529b612-ddf2-4fb7-9823-720a3bd71760", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 1, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0", Pod:"goldmane-666569f655-5x8dj", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.92.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic624c3923f9", MAC:"06:0d:0b:01:2f:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:41.879993 containerd[1710]: 2026-01-15 02:01:41.873 [INFO][4704] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" Namespace="calico-system" Pod="goldmane-666569f655-5x8dj" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-goldmane--666569f655--5x8dj-eth0" Jan 15 02:01:41.891000 audit[4731]: NETFILTER_CFG table=filter:130 family=2 entries=60 op=nft_register_chain pid=4731 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 02:01:41.893249 kernel: kauditd_printk_skb: 400 callbacks suppressed Jan 15 02:01:41.893296 kernel: audit: type=1325 audit(1768442501.891:723): table=filter:130 family=2 entries=60 op=nft_register_chain pid=4731 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 02:01:41.891000 audit[4731]: SYSCALL arch=c000003e syscall=46 success=yes exit=29916 a0=3 a1=7fffa2ecd950 a2=0 a3=7fffa2ecd93c items=0 ppid=4186 pid=4731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.899164 kernel: audit: type=1300 audit(1768442501.891:723): arch=c000003e syscall=46 success=yes exit=29916 a0=3 a1=7fffa2ecd950 a2=0 a3=7fffa2ecd93c items=0 ppid=4186 pid=4731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.899215 kernel: audit: type=1327 audit(1768442501.891:723): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 02:01:41.891000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 02:01:41.916891 containerd[1710]: time="2026-01-15T02:01:41.916463213Z" level=info msg="connecting to shim 71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0" address="unix:///run/containerd/s/6f051cf4602a27d84853302ac9bed47fcb8da20595133e30485370275337cb51" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:01:41.938343 systemd[1]: Started cri-containerd-71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0.scope - libcontainer container 71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0. Jan 15 02:01:41.948000 audit: BPF prog-id=246 op=LOAD Jan 15 02:01:41.949000 audit: BPF prog-id=247 op=LOAD Jan 15 02:01:41.950708 kernel: audit: type=1334 audit(1768442501.948:724): prog-id=246 op=LOAD Jan 15 02:01:41.950761 kernel: audit: type=1334 audit(1768442501.949:725): prog-id=247 op=LOAD Jan 15 02:01:41.950777 kernel: audit: type=1300 audit(1768442501.949:725): arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=4740 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.949000 audit[4752]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178238 a2=98 a3=0 items=0 ppid=4740 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.949000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731653438616237316235663038363439363934633263386636623736 Jan 15 02:01:41.955601 kernel: audit: type=1327 audit(1768442501.949:725): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731653438616237316235663038363439363934633263386636623736 Jan 15 02:01:41.949000 audit: BPF prog-id=247 op=UNLOAD Jan 15 02:01:41.958205 kernel: audit: type=1334 audit(1768442501.949:726): prog-id=247 op=UNLOAD Jan 15 02:01:41.949000 audit[4752]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4740 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.960597 kernel: audit: type=1300 audit(1768442501.949:726): arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4740 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.949000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731653438616237316235663038363439363934633263386636623736 Jan 15 02:01:41.964403 kernel: audit: type=1327 audit(1768442501.949:726): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731653438616237316235663038363439363934633263386636623736 Jan 15 02:01:41.949000 audit: BPF prog-id=248 op=LOAD Jan 15 02:01:41.949000 audit[4752]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c000178488 a2=98 a3=0 items=0 ppid=4740 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.949000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731653438616237316235663038363439363934633263386636623736 Jan 15 02:01:41.949000 audit: BPF prog-id=249 op=LOAD Jan 15 02:01:41.949000 audit[4752]: SYSCALL arch=c000003e syscall=321 success=yes exit=23 a0=5 a1=c000178218 a2=98 a3=0 items=0 ppid=4740 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.949000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731653438616237316235663038363439363934633263386636623736 Jan 15 02:01:41.949000 audit: BPF prog-id=249 op=UNLOAD Jan 15 02:01:41.949000 audit[4752]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4740 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.949000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731653438616237316235663038363439363934633263386636623736 Jan 15 02:01:41.949000 audit: BPF prog-id=248 op=UNLOAD Jan 15 02:01:41.949000 audit[4752]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4740 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.949000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731653438616237316235663038363439363934633263386636623736 Jan 15 02:01:41.949000 audit: BPF prog-id=250 op=LOAD Jan 15 02:01:41.949000 audit[4752]: SYSCALL arch=c000003e syscall=321 success=yes exit=21 a0=5 a1=c0001786e8 a2=98 a3=0 items=0 ppid=4740 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:41.949000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731653438616237316235663038363439363934633263386636623736 Jan 15 02:01:41.993642 containerd[1710]: time="2026-01-15T02:01:41.993290806Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-5x8dj,Uid:7529b612-ddf2-4fb7-9823-720a3bd71760,Namespace:calico-system,Attempt:0,} returns sandbox id \"71e48ab71b5f08649694c2c8f6b76c39e2f737e213f2a3327baad8a1d61c53c0\"" Jan 15 02:01:41.995572 containerd[1710]: time="2026-01-15T02:01:41.995546392Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 02:01:42.016474 kubelet[2935]: E0115 02:01:42.016373 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:01:42.016848 kubelet[2935]: E0115 02:01:42.016740 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:01:42.017193 kubelet[2935]: E0115 02:01:42.017131 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:01:42.149000 audit[4779]: NETFILTER_CFG table=filter:131 family=2 entries=14 op=nft_register_rule pid=4779 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:42.149000 audit[4779]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffcf0b2e710 a2=0 a3=7ffcf0b2e6fc items=0 ppid=3037 pid=4779 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:42.149000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:42.149741 systemd-networkd[1600]: calie9812df1f30: Gained IPv6LL Jan 15 02:01:42.169000 audit[4779]: NETFILTER_CFG table=nat:132 family=2 entries=56 op=nft_register_chain pid=4779 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:42.169000 audit[4779]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7ffcf0b2e710 a2=0 a3=7ffcf0b2e6fc items=0 ppid=3037 pid=4779 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:42.169000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:42.345623 containerd[1710]: time="2026-01-15T02:01:42.345347832Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:01:42.347115 containerd[1710]: time="2026-01-15T02:01:42.347069142Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 02:01:42.347336 containerd[1710]: time="2026-01-15T02:01:42.347134540Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:42.348320 kubelet[2935]: E0115 02:01:42.348239 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 02:01:42.348320 kubelet[2935]: E0115 02:01:42.348276 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 02:01:42.348785 kubelet[2935]: E0115 02:01:42.348390 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hkwh7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-5x8dj_calico-system(7529b612-ddf2-4fb7-9823-720a3bd71760): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 02:01:42.349695 kubelet[2935]: E0115 02:01:42.349653 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:01:42.738097 containerd[1710]: time="2026-01-15T02:01:42.737906200Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5dcd9489f8-9mlsx,Uid:19301f0b-a1cd-4987-b493-85e61d59a457,Namespace:calico-apiserver,Attempt:0,}" Jan 15 02:01:42.881266 systemd-networkd[1600]: cali2cb0fb3dab0: Link UP Jan 15 02:01:42.882324 systemd-networkd[1600]: cali2cb0fb3dab0: Gained carrier Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.796 [INFO][4782] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--9mlsx-eth0 calico-apiserver-5dcd9489f8- calico-apiserver 19301f0b-a1cd-4987-b493-85e61d59a457 852 0 2026-01-15 02:01:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5dcd9489f8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4515-1-0-n-e5e35ee394 calico-apiserver-5dcd9489f8-9mlsx eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2cb0fb3dab0 [] [] }} ContainerID="8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-9mlsx" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--9mlsx-" Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.797 [INFO][4782] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-9mlsx" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--9mlsx-eth0" Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.839 [INFO][4794] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" HandleID="k8s-pod-network.8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" Workload="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--9mlsx-eth0" Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.839 [INFO][4794] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" HandleID="k8s-pod-network.8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" Workload="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--9mlsx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4fe0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4515-1-0-n-e5e35ee394", "pod":"calico-apiserver-5dcd9489f8-9mlsx", "timestamp":"2026-01-15 02:01:42.83924843 +0000 UTC"}, Hostname:"ci-4515-1-0-n-e5e35ee394", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.839 [INFO][4794] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.839 [INFO][4794] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.839 [INFO][4794] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-n-e5e35ee394' Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.846 [INFO][4794] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.851 [INFO][4794] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.855 [INFO][4794] ipam/ipam.go 511: Trying affinity for 192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.857 [INFO][4794] ipam/ipam.go 158: Attempting to load block cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.859 [INFO][4794] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.92.64/26 host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.859 [INFO][4794] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.92.64/26 handle="k8s-pod-network.8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.861 [INFO][4794] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.865 [INFO][4794] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.92.64/26 handle="k8s-pod-network.8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.874 [INFO][4794] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.92.72/26] block=192.168.92.64/26 handle="k8s-pod-network.8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.874 [INFO][4794] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.92.72/26] handle="k8s-pod-network.8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" host="ci-4515-1-0-n-e5e35ee394" Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.874 [INFO][4794] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 15 02:01:42.897769 containerd[1710]: 2026-01-15 02:01:42.875 [INFO][4794] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.92.72/26] IPv6=[] ContainerID="8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" HandleID="k8s-pod-network.8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" Workload="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--9mlsx-eth0" Jan 15 02:01:42.898597 containerd[1710]: 2026-01-15 02:01:42.878 [INFO][4782] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-9mlsx" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--9mlsx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--9mlsx-eth0", GenerateName:"calico-apiserver-5dcd9489f8-", Namespace:"calico-apiserver", SelfLink:"", UID:"19301f0b-a1cd-4987-b493-85e61d59a457", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 1, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5dcd9489f8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"", Pod:"calico-apiserver-5dcd9489f8-9mlsx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.92.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2cb0fb3dab0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:42.898597 containerd[1710]: 2026-01-15 02:01:42.878 [INFO][4782] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.92.72/32] ContainerID="8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-9mlsx" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--9mlsx-eth0" Jan 15 02:01:42.898597 containerd[1710]: 2026-01-15 02:01:42.878 [INFO][4782] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2cb0fb3dab0 ContainerID="8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-9mlsx" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--9mlsx-eth0" Jan 15 02:01:42.898597 containerd[1710]: 2026-01-15 02:01:42.882 [INFO][4782] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-9mlsx" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--9mlsx-eth0" Jan 15 02:01:42.898597 containerd[1710]: 2026-01-15 02:01:42.883 
[INFO][4782] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-9mlsx" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--9mlsx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--9mlsx-eth0", GenerateName:"calico-apiserver-5dcd9489f8-", Namespace:"calico-apiserver", SelfLink:"", UID:"19301f0b-a1cd-4987-b493-85e61d59a457", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2026, time.January, 15, 2, 1, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5dcd9489f8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-n-e5e35ee394", ContainerID:"8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc", Pod:"calico-apiserver-5dcd9489f8-9mlsx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.92.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2cb0fb3dab0", MAC:"5e:5f:93:d3:45:cc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 15 02:01:42.898597 containerd[1710]: 2026-01-15 02:01:42.893 [INFO][4782] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" Namespace="calico-apiserver" Pod="calico-apiserver-5dcd9489f8-9mlsx" WorkloadEndpoint="ci--4515--1--0--n--e5e35ee394-k8s-calico--apiserver--5dcd9489f8--9mlsx-eth0" Jan 15 02:01:42.915000 audit[4807]: NETFILTER_CFG table=filter:133 family=2 entries=41 op=nft_register_chain pid=4807 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 15 02:01:42.915000 audit[4807]: SYSCALL arch=c000003e syscall=46 success=yes exit=23096 a0=3 a1=7ffe4b9387b0 a2=0 a3=7ffe4b93879c items=0 ppid=4186 pid=4807 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:42.915000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 15 02:01:42.937535 containerd[1710]: time="2026-01-15T02:01:42.937495790Z" level=info msg="connecting to shim 8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc" address="unix:///run/containerd/s/d910693ce70cc4e0e4c94d0795e1de6da1d6be387e748557e7539fd48f63326d" namespace=k8s.io protocol=ttrpc version=3 Jan 15 02:01:42.960397 systemd[1]: Started cri-containerd-8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc.scope - libcontainer container 
8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc. Jan 15 02:01:42.972000 audit: BPF prog-id=251 op=LOAD Jan 15 02:01:42.972000 audit: BPF prog-id=252 op=LOAD Jan 15 02:01:42.972000 audit[4828]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130238 a2=98 a3=0 items=0 ppid=4815 pid=4828 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:42.972000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863666237616233383563333866626465633464393037653833353662 Jan 15 02:01:42.972000 audit: BPF prog-id=252 op=UNLOAD Jan 15 02:01:42.972000 audit[4828]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4815 pid=4828 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:42.972000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863666237616233383563333866626465633464393037653833353662 Jan 15 02:01:42.972000 audit: BPF prog-id=253 op=LOAD Jan 15 02:01:42.972000 audit[4828]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c000130488 a2=98 a3=0 items=0 ppid=4815 pid=4828 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:42.972000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863666237616233383563333866626465633464393037653833353662 Jan 15 02:01:42.972000 audit: BPF prog-id=254 op=LOAD Jan 15 02:01:42.972000 audit[4828]: SYSCALL arch=c000003e syscall=321 success=yes exit=22 a0=5 a1=c000130218 a2=98 a3=0 items=0 ppid=4815 pid=4828 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:42.972000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863666237616233383563333866626465633464393037653833353662 Jan 15 02:01:42.972000 audit: BPF prog-id=254 op=UNLOAD Jan 15 02:01:42.972000 audit[4828]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4815 pid=4828 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:42.972000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863666237616233383563333866626465633464393037653833353662 Jan 15 02:01:42.972000 audit: BPF 
prog-id=253 op=UNLOAD Jan 15 02:01:42.972000 audit[4828]: SYSCALL arch=c000003e syscall=3 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4815 pid=4828 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:42.972000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863666237616233383563333866626465633464393037653833353662 Jan 15 02:01:42.972000 audit: BPF prog-id=255 op=LOAD Jan 15 02:01:42.972000 audit[4828]: SYSCALL arch=c000003e syscall=321 success=yes exit=20 a0=5 a1=c0001306e8 a2=98 a3=0 items=0 ppid=4815 pid=4828 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:42.972000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863666237616233383563333866626465633464393037653833353662 Jan 15 02:01:43.007216 containerd[1710]: time="2026-01-15T02:01:43.006781540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcd9489f8-9mlsx,Uid:19301f0b-a1cd-4987-b493-85e61d59a457,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8cfb7ab385c38fbdec4d907e8356bcbea98cd80f4ec1779d7393af0b418a94dc\"" Jan 15 02:01:43.009115 containerd[1710]: time="2026-01-15T02:01:43.009087868Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 02:01:43.015085 kubelet[2935]: E0115 02:01:43.015062 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:01:43.054000 audit[4857]: NETFILTER_CFG table=filter:134 family=2 entries=14 op=nft_register_rule pid=4857 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:43.054000 audit[4857]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffd4359d8b0 a2=0 a3=7ffd4359d89c items=0 ppid=3037 pid=4857 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:43.054000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:43.058000 audit[4857]: NETFILTER_CFG table=nat:135 family=2 entries=20 op=nft_register_rule pid=4857 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:43.058000 audit[4857]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffd4359d8b0 a2=0 a3=7ffd4359d89c items=0 ppid=3037 pid=4857 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:43.058000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:43.174064 systemd-networkd[1600]: calic624c3923f9: Gained IPv6LL Jan 15 02:01:43.338515 containerd[1710]: time="2026-01-15T02:01:43.338221365Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:01:43.340745 containerd[1710]: time="2026-01-15T02:01:43.340662918Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 02:01:43.340856 containerd[1710]: time="2026-01-15T02:01:43.340803702Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:43.341454 kubelet[2935]: E0115 02:01:43.341186 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:01:43.341454 kubelet[2935]: E0115 02:01:43.341267 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:01:43.345635 kubelet[2935]: E0115 02:01:43.342613 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nzlsr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5dcd9489f8-9mlsx_calico-apiserver(19301f0b-a1cd-4987-b493-85e61d59a457): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 02:01:43.348681 kubelet[2935]: E0115 02:01:43.346272 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:01:44.017795 kubelet[2935]: E0115 02:01:44.017746 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:01:44.060000 audit[4864]: NETFILTER_CFG table=filter:136 family=2 entries=14 op=nft_register_rule pid=4864 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:44.060000 audit[4864]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffc6eaf57d0 a2=0 a3=7ffc6eaf57bc items=0 ppid=3037 pid=4864 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:44.060000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:44.063000 audit[4864]: NETFILTER_CFG table=nat:137 family=2 entries=20 op=nft_register_rule pid=4864 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:01:44.063000 audit[4864]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffc6eaf57d0 a2=0 a3=7ffc6eaf57bc items=0 ppid=3037 pid=4864 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:01:44.063000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:01:44.261241 systemd-networkd[1600]: cali2cb0fb3dab0: Gained IPv6LL Jan 15 02:01:53.741768 containerd[1710]: time="2026-01-15T02:01:53.741221163Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 02:01:54.067237 containerd[1710]: time="2026-01-15T02:01:54.067115846Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:01:54.069665 containerd[1710]: time="2026-01-15T02:01:54.069509475Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 02:01:54.069665 containerd[1710]: time="2026-01-15T02:01:54.069579181Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:54.069885 kubelet[2935]: E0115 02:01:54.069724 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 02:01:54.069885 kubelet[2935]: E0115 02:01:54.069789 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 02:01:54.071862 kubelet[2935]: E0115 02:01:54.070001 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2xk5t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b978c6cc9-jnxw2_calico-system(85d7699e-6a0a-4fd1-96b0-9365b90d23ad): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 02:01:54.073307 kubelet[2935]: E0115 02:01:54.072047 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:01:54.739478 containerd[1710]: time="2026-01-15T02:01:54.738797694Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 02:01:55.098979 containerd[1710]: time="2026-01-15T02:01:55.098897630Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:01:55.101243 containerd[1710]: time="2026-01-15T02:01:55.101143153Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 02:01:55.101430 containerd[1710]: time="2026-01-15T02:01:55.101195701Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:55.102025 kubelet[2935]: E0115 02:01:55.101634 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 02:01:55.102025 kubelet[2935]: E0115 02:01:55.101709 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 02:01:55.102025 kubelet[2935]: E0115 
02:01:55.101922 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:66ec0441fba04fd9bc4d56215e73797e,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-swvrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cd9b9db69-tblzt_calico-system(0839e1a2-38f7-4739-be41-8f605565d9d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 02:01:55.106191 containerd[1710]: time="2026-01-15T02:01:55.105970743Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 02:01:55.444608 containerd[1710]: time="2026-01-15T02:01:55.444274462Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:01:55.446411 containerd[1710]: time="2026-01-15T02:01:55.446259308Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 02:01:55.446411 containerd[1710]: time="2026-01-15T02:01:55.446336926Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:55.446992 kubelet[2935]: E0115 02:01:55.446817 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 02:01:55.446992 kubelet[2935]: E0115 02:01:55.446934 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 02:01:55.447744 kubelet[2935]: E0115 02:01:55.447621 2935 
kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-swvrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cd9b9db69-tblzt_calico-system(0839e1a2-38f7-4739-be41-8f605565d9d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 02:01:55.449091 kubelet[2935]: E0115 02:01:55.448970 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:01:55.745675 containerd[1710]: time="2026-01-15T02:01:55.744639466Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 02:01:56.100623 containerd[1710]: time="2026-01-15T02:01:56.100471169Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:01:56.103614 containerd[1710]: time="2026-01-15T02:01:56.103551865Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:56.103956 containerd[1710]: time="2026-01-15T02:01:56.103798835Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 02:01:56.104416 kubelet[2935]: E0115 02:01:56.104350 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:01:56.105876 kubelet[2935]: E0115 02:01:56.104431 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:01:56.105962 containerd[1710]: time="2026-01-15T02:01:56.104941918Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 02:01:56.106805 kubelet[2935]: E0115 02:01:56.106646 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nzlsr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5dcd9489f8-9mlsx_calico-apiserver(19301f0b-a1cd-4987-b493-85e61d59a457): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not 
found" logger="UnhandledError" Jan 15 02:01:56.107991 kubelet[2935]: E0115 02:01:56.107924 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:01:56.463829 containerd[1710]: time="2026-01-15T02:01:56.462805976Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:01:56.464952 containerd[1710]: time="2026-01-15T02:01:56.464881531Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 02:01:56.465097 containerd[1710]: time="2026-01-15T02:01:56.465024238Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:56.465448 kubelet[2935]: E0115 02:01:56.465383 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 02:01:56.465745 kubelet[2935]: E0115 02:01:56.465625 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 02:01:56.468378 kubelet[2935]: E0115 02:01:56.468269 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ktgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hdqqp_calico-system(328d556f-d445-4769-97a7-1a5530a232c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 02:01:56.472478 containerd[1710]: time="2026-01-15T02:01:56.472426161Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 02:01:56.818895 containerd[1710]: time="2026-01-15T02:01:56.818801944Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:01:56.820728 containerd[1710]: time="2026-01-15T02:01:56.820652857Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 02:01:56.821245 containerd[1710]: time="2026-01-15T02:01:56.820782434Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:56.823069 kubelet[2935]: E0115 02:01:56.822454 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 02:01:56.823069 kubelet[2935]: E0115 02:01:56.822526 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 02:01:56.823069 kubelet[2935]: E0115 02:01:56.822859 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ktgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hdqqp_calico-system(328d556f-d445-4769-97a7-1a5530a232c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 02:01:56.824059 containerd[1710]: time="2026-01-15T02:01:56.823972028Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 02:01:56.824505 kubelet[2935]: E0115 02:01:56.824264 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:01:57.167809 containerd[1710]: time="2026-01-15T02:01:57.167474076Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 
02:01:57.170087 containerd[1710]: time="2026-01-15T02:01:57.169999897Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 02:01:57.170322 containerd[1710]: time="2026-01-15T02:01:57.170143968Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:57.170525 kubelet[2935]: E0115 02:01:57.170445 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:01:57.171042 kubelet[2935]: E0115 02:01:57.170524 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:01:57.171042 kubelet[2935]: E0115 02:01:57.170739 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzplf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5dcd9489f8-x7ppt_calico-apiserver(768894fc-e95b-49e5-9a90-1487b94ce02a): ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 02:01:57.172624 kubelet[2935]: E0115 02:01:57.172561 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:01:57.740973 containerd[1710]: time="2026-01-15T02:01:57.740404274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 02:01:58.103508 containerd[1710]: time="2026-01-15T02:01:58.103099907Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:01:58.105508 containerd[1710]: time="2026-01-15T02:01:58.105446593Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 02:01:58.105797 containerd[1710]: time="2026-01-15T02:01:58.105638704Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 02:01:58.106193 kubelet[2935]: E0115 02:01:58.106093 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 02:01:58.106400 kubelet[2935]: E0115 02:01:58.106365 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 02:01:58.106872 kubelet[2935]: E0115 02:01:58.106764 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hkwh7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-5x8dj_calico-system(7529b612-ddf2-4fb7-9823-720a3bd71760): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 02:01:58.108621 kubelet[2935]: E0115 02:01:58.108531 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:02:04.739853 kubelet[2935]: E0115 02:02:04.739748 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:02:06.743571 kubelet[2935]: E0115 02:02:06.743440 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:02:07.741727 kubelet[2935]: E0115 02:02:07.741087 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:02:08.738532 kubelet[2935]: E0115 02:02:08.738445 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:02:10.741792 kubelet[2935]: E0115 02:02:10.741489 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:02:11.739843 kubelet[2935]: E0115 
02:02:11.739799 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:02:15.743933 containerd[1710]: time="2026-01-15T02:02:15.743822306Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 02:02:16.102114 containerd[1710]: time="2026-01-15T02:02:16.102074313Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:02:16.104085 containerd[1710]: time="2026-01-15T02:02:16.104055315Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 02:02:16.104159 containerd[1710]: time="2026-01-15T02:02:16.104127515Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 02:02:16.104280 kubelet[2935]: E0115 02:02:16.104249 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 02:02:16.104638 kubelet[2935]: E0115 02:02:16.104291 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 02:02:16.104638 kubelet[2935]: E0115 02:02:16.104418 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2xk5t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b978c6cc9-jnxw2_calico-system(85d7699e-6a0a-4fd1-96b0-9365b90d23ad): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 02:02:16.105854 kubelet[2935]: E0115 02:02:16.105831 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:02:19.740253 containerd[1710]: time="2026-01-15T02:02:19.739913377Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 02:02:20.067573 containerd[1710]: 
time="2026-01-15T02:02:20.067539053Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:02:20.069821 containerd[1710]: time="2026-01-15T02:02:20.069789326Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 02:02:20.069907 containerd[1710]: time="2026-01-15T02:02:20.069866292Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 02:02:20.070126 kubelet[2935]: E0115 02:02:20.070098 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 02:02:20.071456 kubelet[2935]: E0115 02:02:20.070329 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 02:02:20.071456 kubelet[2935]: E0115 02:02:20.070464 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hkwh7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-5x8dj_calico-system(7529b612-ddf2-4fb7-9823-720a3bd71760): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 02:02:20.071654 kubelet[2935]: E0115 02:02:20.071635 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:02:20.737895 containerd[1710]: time="2026-01-15T02:02:20.737856795Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 02:02:21.062489 containerd[1710]: time="2026-01-15T02:02:21.062448046Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:02:21.064739 containerd[1710]: time="2026-01-15T02:02:21.064709767Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 02:02:21.064839 containerd[1710]: time="2026-01-15T02:02:21.064778201Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 02:02:21.065289 kubelet[2935]: E0115 02:02:21.065248 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:02:21.065408 kubelet[2935]: E0115 02:02:21.065288 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:02:21.065531 kubelet[2935]: E0115 02:02:21.065489 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzplf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5dcd9489f8-x7ppt_calico-apiserver(768894fc-e95b-49e5-9a90-1487b94ce02a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 02:02:21.066959 containerd[1710]: time="2026-01-15T02:02:21.066934319Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 02:02:21.067203 kubelet[2935]: E0115 02:02:21.067141 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:02:21.405511 containerd[1710]: time="2026-01-15T02:02:21.405302247Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:02:21.407070 containerd[1710]: time="2026-01-15T02:02:21.407039764Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 02:02:21.407289 containerd[1710]: time="2026-01-15T02:02:21.407108315Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 02:02:21.407437 kubelet[2935]: E0115 
02:02:21.407247 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 02:02:21.407437 kubelet[2935]: E0115 02:02:21.407285 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 02:02:21.407437 kubelet[2935]: E0115 02:02:21.407377 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:66ec0441fba04fd9bc4d56215e73797e,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-swvrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cd9b9db69-tblzt_calico-system(0839e1a2-38f7-4739-be41-8f605565d9d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 02:02:21.409671 containerd[1710]: time="2026-01-15T02:02:21.409650697Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 02:02:21.743233 containerd[1710]: time="2026-01-15T02:02:21.742412370Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:02:21.746147 containerd[1710]: time="2026-01-15T02:02:21.746011703Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 02:02:21.746147 containerd[1710]: time="2026-01-15T02:02:21.746112491Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 02:02:21.747400 kubelet[2935]: E0115 02:02:21.746501 2935 log.go:32] "PullImage from image service failed" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 02:02:21.747400 kubelet[2935]: E0115 02:02:21.746738 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 02:02:21.747400 kubelet[2935]: E0115 02:02:21.746934 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-swvrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cd9b9db69-tblzt_calico-system(0839e1a2-38f7-4739-be41-8f605565d9d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 02:02:21.748468 kubelet[2935]: E0115 02:02:21.748395 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: 
not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:02:23.746138 containerd[1710]: time="2026-01-15T02:02:23.745900123Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 02:02:24.096841 containerd[1710]: time="2026-01-15T02:02:24.096772150Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:02:24.099109 containerd[1710]: time="2026-01-15T02:02:24.098976988Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 02:02:24.099109 containerd[1710]: time="2026-01-15T02:02:24.099074858Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 02:02:24.099584 kubelet[2935]: E0115 02:02:24.099491 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 02:02:24.099584 kubelet[2935]: E0115 02:02:24.099554 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 02:02:24.100622 kubelet[2935]: E0115 02:02:24.100206 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ktgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in 
pod csi-node-driver-hdqqp_calico-system(328d556f-d445-4769-97a7-1a5530a232c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 02:02:24.103082 containerd[1710]: time="2026-01-15T02:02:24.102816505Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 02:02:24.442695 containerd[1710]: time="2026-01-15T02:02:24.442266080Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:02:24.444478 containerd[1710]: time="2026-01-15T02:02:24.444397009Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 02:02:24.444478 containerd[1710]: time="2026-01-15T02:02:24.444446664Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 02:02:24.444706 kubelet[2935]: E0115 02:02:24.444676 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 02:02:24.444792 kubelet[2935]: E0115 02:02:24.444776 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 02:02:24.444941 kubelet[2935]: E0115 02:02:24.444911 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ktgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hdqqp_calico-system(328d556f-d445-4769-97a7-1a5530a232c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 02:02:24.446409 kubelet[2935]: E0115 02:02:24.446387 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:02:24.739177 containerd[1710]: time="2026-01-15T02:02:24.738142594Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 02:02:25.064205 containerd[1710]: time="2026-01-15T02:02:25.063317874Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:02:25.066099 containerd[1710]: time="2026-01-15T02:02:25.065968701Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 02:02:25.066099 containerd[1710]: time="2026-01-15T02:02:25.066051175Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 02:02:25.066637 kubelet[2935]: E0115 02:02:25.066558 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:02:25.066760 kubelet[2935]: E0115 02:02:25.066654 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:02:25.066996 kubelet[2935]: E0115 02:02:25.066908 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nzlsr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5dcd9489f8-9mlsx_calico-apiserver(19301f0b-a1cd-4987-b493-85e61d59a457): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 02:02:25.068441 kubelet[2935]: E0115 02:02:25.068383 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:02:30.738653 kubelet[2935]: E0115 02:02:30.738448 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:02:31.740590 kubelet[2935]: E0115 02:02:31.739935 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:02:31.744424 kubelet[2935]: E0115 02:02:31.743531 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:02:35.743487 kubelet[2935]: E0115 02:02:35.743445 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:02:36.743550 kubelet[2935]: E0115 02:02:36.743391 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:02:38.738315 kubelet[2935]: E0115 02:02:38.738275 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:02:41.740040 kubelet[2935]: E0115 02:02:41.739986 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:02:42.738859 kubelet[2935]: E0115 02:02:42.738724 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:02:43.739272 kubelet[2935]: E0115 02:02:43.739236 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:02:47.746580 kubelet[2935]: E0115 02:02:47.745952 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:02:48.738918 kubelet[2935]: E0115 02:02:48.738833 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:02:49.739425 kubelet[2935]: E0115 02:02:49.739298 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:02:56.738796 kubelet[2935]: E0115 02:02:56.738503 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:02:56.739700 containerd[1710]: time="2026-01-15T02:02:56.739432598Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 02:02:57.093481 containerd[1710]: time="2026-01-15T02:02:57.093282329Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:02:57.096273 containerd[1710]: time="2026-01-15T02:02:57.095572433Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 02:02:57.096273 containerd[1710]: time="2026-01-15T02:02:57.095654495Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 02:02:57.096434 kubelet[2935]: E0115 02:02:57.095884 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 
02:02:57.096434 kubelet[2935]: E0115 02:02:57.096012 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 02:02:57.096434 kubelet[2935]: E0115 02:02:57.096238 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2xk5t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b978c6cc9-jnxw2_calico-system(85d7699e-6a0a-4fd1-96b0-9365b90d23ad): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 02:02:57.097828 kubelet[2935]: E0115 02:02:57.097777 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to 
pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:02:58.739580 kubelet[2935]: E0115 02:02:58.739381 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:03:01.740112 kubelet[2935]: E0115 02:03:01.740079 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:03:02.737646 containerd[1710]: time="2026-01-15T02:03:02.737608240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 02:03:03.081350 containerd[1710]: time="2026-01-15T02:03:03.081216309Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:03:03.084172 containerd[1710]: time="2026-01-15T02:03:03.083143390Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 02:03:03.084329 containerd[1710]: time="2026-01-15T02:03:03.083170993Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 02:03:03.084539 kubelet[2935]: E0115 02:03:03.084478 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 02:03:03.084539 kubelet[2935]: E0115 02:03:03.084520 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 02:03:03.085251 kubelet[2935]: E0115 02:03:03.084916 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:66ec0441fba04fd9bc4d56215e73797e,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-swvrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cd9b9db69-tblzt_calico-system(0839e1a2-38f7-4739-be41-8f605565d9d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 02:03:03.087037 containerd[1710]: time="2026-01-15T02:03:03.086880403Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 02:03:03.440750 containerd[1710]: time="2026-01-15T02:03:03.440020821Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:03:03.443455 containerd[1710]: time="2026-01-15T02:03:03.443271690Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 02:03:03.443455 containerd[1710]: time="2026-01-15T02:03:03.443389596Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 02:03:03.443894 kubelet[2935]: E0115 02:03:03.443725 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 02:03:03.443894 kubelet[2935]: E0115 02:03:03.443805 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 02:03:03.444206 kubelet[2935]: E0115 02:03:03.444032 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-swvrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cd9b9db69-tblzt_calico-system(0839e1a2-38f7-4739-be41-8f605565d9d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 02:03:03.445771 kubelet[2935]: E0115 02:03:03.445608 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:03:03.743115 kubelet[2935]: E0115 02:03:03.742301 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: 
code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:03:07.740469 containerd[1710]: time="2026-01-15T02:03:07.740413988Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 02:03:08.085247 containerd[1710]: time="2026-01-15T02:03:08.085131388Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:03:08.087458 containerd[1710]: time="2026-01-15T02:03:08.087335909Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 02:03:08.087458 containerd[1710]: time="2026-01-15T02:03:08.087415485Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 02:03:08.088233 kubelet[2935]: E0115 02:03:08.087647 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:03:08.088233 kubelet[2935]: E0115 02:03:08.087690 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:03:08.088233 kubelet[2935]: E0115 02:03:08.087799 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzplf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5dcd9489f8-x7ppt_calico-apiserver(768894fc-e95b-49e5-9a90-1487b94ce02a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 02:03:08.089404 kubelet[2935]: E0115 02:03:08.089313 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:03:12.740572 containerd[1710]: time="2026-01-15T02:03:12.738799659Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 02:03:12.741898 kubelet[2935]: E0115 02:03:12.741850 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:03:13.080167 containerd[1710]: time="2026-01-15T02:03:13.079089923Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:03:13.082596 containerd[1710]: time="2026-01-15T02:03:13.082500594Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 02:03:13.082596 containerd[1710]: time="2026-01-15T02:03:13.082570254Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 02:03:13.082798 kubelet[2935]: E0115 02:03:13.082764 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 02:03:13.082868 kubelet[2935]: E0115 
02:03:13.082858 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 02:03:13.083051 kubelet[2935]: E0115 02:03:13.083019 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hkwh7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-5x8dj_calico-system(7529b612-ddf2-4fb7-9823-720a3bd71760): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 02:03:13.086307 kubelet[2935]: E0115 02:03:13.086277 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc 
error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:03:13.743030 containerd[1710]: time="2026-01-15T02:03:13.741636154Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 02:03:14.075856 containerd[1710]: time="2026-01-15T02:03:14.075817268Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:03:14.077698 containerd[1710]: time="2026-01-15T02:03:14.077667142Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 02:03:14.077801 containerd[1710]: time="2026-01-15T02:03:14.077734424Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 02:03:14.078048 kubelet[2935]: E0115 02:03:14.078004 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:03:14.079681 kubelet[2935]: E0115 02:03:14.078648 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:03:14.079681 kubelet[2935]: E0115 02:03:14.078834 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nzlsr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5dcd9489f8-9mlsx_calico-apiserver(19301f0b-a1cd-4987-b493-85e61d59a457): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 02:03:14.080099 kubelet[2935]: E0115 02:03:14.080044 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:03:14.740460 containerd[1710]: time="2026-01-15T02:03:14.740208460Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 02:03:14.740826 kubelet[2935]: E0115 02:03:14.740775 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:03:14.807575 systemd[1]: Started sshd@9-10.0.1.164:22-4.153.228.146:55016.service - OpenSSH per-connection server daemon (4.153.228.146:55016). Jan 15 02:03:14.806000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.1.164:22-4.153.228.146:55016 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:14.808701 kernel: kauditd_printk_skb: 58 callbacks suppressed Jan 15 02:03:14.808780 kernel: audit: type=1130 audit(1768442594.806:747): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.1.164:22-4.153.228.146:55016 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 02:03:15.075309 containerd[1710]: time="2026-01-15T02:03:15.075216082Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:03:15.078031 containerd[1710]: time="2026-01-15T02:03:15.077890785Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 02:03:15.078378 kubelet[2935]: E0115 02:03:15.078298 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 02:03:15.078951 kubelet[2935]: E0115 02:03:15.078402 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 02:03:15.078951 kubelet[2935]: E0115 02:03:15.078772 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ktgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hdqqp_calico-system(328d556f-d445-4769-97a7-1a5530a232c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 02:03:15.079546 containerd[1710]: time="2026-01-15T02:03:15.077981964Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 02:03:15.083248 containerd[1710]: time="2026-01-15T02:03:15.082315846Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 02:03:15.369000 audit[5045]: USER_ACCT pid=5045 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:15.374942 sshd[5045]: Accepted publickey for core from 4.153.228.146 port 55016 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:03:15.375928 kernel: audit: type=1101 audit(1768442595.369:748): pid=5045 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:15.375000 audit[5045]: CRED_ACQ pid=5045 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:15.381393 kernel: audit: type=1103 audit(1768442595.375:749): pid=5045 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:15.381473 kernel: audit: type=1006 audit(1768442595.375:750): pid=5045 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Jan 15 02:03:15.382996 sshd-session[5045]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:03:15.375000 audit[5045]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc563ab640 a2=3 a3=0 items=0 ppid=1 pid=5045 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:15.375000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:15.388579 kernel: audit: type=1300 audit(1768442595.375:750): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc563ab640 a2=3 a3=0 items=0 ppid=1 pid=5045 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:15.388652 kernel: audit: type=1327 audit(1768442595.375:750): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:15.396446 systemd-logind[1687]: New session 10 of user core. Jan 15 02:03:15.401386 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 15 02:03:15.403000 audit[5045]: USER_START pid=5045 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:15.407000 audit[5048]: CRED_ACQ pid=5048 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:15.409704 kernel: audit: type=1105 audit(1768442595.403:751): pid=5045 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:15.409768 kernel: audit: type=1103 audit(1768442595.407:752): pid=5048 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:15.417443 containerd[1710]: time="2026-01-15T02:03:15.417296028Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:03:15.419335 containerd[1710]: time="2026-01-15T02:03:15.419294994Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 02:03:15.419416 containerd[1710]: time="2026-01-15T02:03:15.419371874Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 02:03:15.419529 kubelet[2935]: E0115 02:03:15.419497 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 02:03:15.419573 kubelet[2935]: E0115 02:03:15.419540 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 02:03:15.419861 kubelet[2935]: E0115 02:03:15.419830 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ktgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hdqqp_calico-system(328d556f-d445-4769-97a7-1a5530a232c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 02:03:15.420998 kubelet[2935]: E0115 02:03:15.420962 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:03:15.774304 sshd[5048]: Connection closed by 4.153.228.146 port 55016 Jan 15 02:03:15.776418 sshd-session[5045]: pam_unix(sshd:session): session closed for user core Jan 15 02:03:15.777000 audit[5045]: USER_END pid=5045 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:15.785168 kernel: audit: type=1106 audit(1768442595.777:753): pid=5045 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:15.785182 systemd[1]: sshd@9-10.0.1.164:22-4.153.228.146:55016.service: Deactivated successfully. Jan 15 02:03:15.787069 systemd-logind[1687]: Session 10 logged out. Waiting for processes to exit. Jan 15 02:03:15.787543 systemd[1]: session-10.scope: Deactivated successfully. Jan 15 02:03:15.777000 audit[5045]: CRED_DISP pid=5045 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:15.789375 systemd-logind[1687]: Removed session 10. Jan 15 02:03:15.793200 kernel: audit: type=1104 audit(1768442595.777:754): pid=5045 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:15.784000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.1.164:22-4.153.228.146:55016 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:20.885417 systemd[1]: Started sshd@10-10.0.1.164:22-4.153.228.146:55024.service - OpenSSH per-connection server daemon (4.153.228.146:55024). Jan 15 02:03:20.884000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.1.164:22-4.153.228.146:55024 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:20.887678 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 02:03:20.887720 kernel: audit: type=1130 audit(1768442600.884:756): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.1.164:22-4.153.228.146:55024 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 02:03:21.432000 audit[5068]: USER_ACCT pid=5068 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:21.436419 sshd[5068]: Accepted publickey for core from 4.153.228.146 port 55024 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:03:21.441534 sshd-session[5068]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:03:21.447229 kernel: audit: type=1101 audit(1768442601.432:757): pid=5068 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:21.438000 audit[5068]: CRED_ACQ pid=5068 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:21.458354 kernel: audit: type=1103 audit(1768442601.438:758): pid=5068 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:21.458527 kernel: audit: type=1006 audit(1768442601.438:759): pid=5068 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 15 02:03:21.438000 audit[5068]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe449dc710 a2=3 a3=0 items=0 ppid=1 pid=5068 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:21.438000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:21.474305 kernel: audit: type=1300 audit(1768442601.438:759): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe449dc710 a2=3 a3=0 items=0 ppid=1 pid=5068 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:21.474447 kernel: audit: type=1327 audit(1768442601.438:759): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:21.481231 systemd-logind[1687]: New session 11 of user core. Jan 15 02:03:21.491636 systemd[1]: Started session-11.scope - Session 11 of User core. 
Jan 15 02:03:21.498000 audit[5068]: USER_START pid=5068 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:21.505000 audit[5073]: CRED_ACQ pid=5073 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:21.512993 kernel: audit: type=1105 audit(1768442601.498:760): pid=5068 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:21.513275 kernel: audit: type=1103 audit(1768442601.505:761): pid=5073 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:21.830230 sshd[5073]: Connection closed by 4.153.228.146 port 55024 Jan 15 02:03:21.833139 sshd-session[5068]: pam_unix(sshd:session): session closed for user core Jan 15 02:03:21.832000 audit[5068]: USER_END pid=5068 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:21.836787 systemd[1]: sshd@10-10.0.1.164:22-4.153.228.146:55024.service: Deactivated successfully. Jan 15 02:03:21.838899 systemd[1]: session-11.scope: Deactivated successfully. Jan 15 02:03:21.839169 kernel: audit: type=1106 audit(1768442601.832:762): pid=5068 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:21.833000 audit[5068]: CRED_DISP pid=5068 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:21.843181 kernel: audit: type=1104 audit(1768442601.833:763): pid=5068 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:21.843675 systemd-logind[1687]: Session 11 logged out. Waiting for processes to exit. Jan 15 02:03:21.836000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.1.164:22-4.153.228.146:55024 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:21.844755 systemd-logind[1687]: Removed session 11. 
Jan 15 02:03:22.740211 kubelet[2935]: E0115 02:03:22.740132 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:03:23.740122 kubelet[2935]: E0115 02:03:23.740049 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:03:25.745952 kubelet[2935]: E0115 02:03:25.745748 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:03:26.943452 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 02:03:26.943545 kernel: audit: type=1130 audit(1768442606.938:765): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.1.164:22-4.153.228.146:50948 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:26.938000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.1.164:22-4.153.228.146:50948 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:26.938412 systemd[1]: Started sshd@11-10.0.1.164:22-4.153.228.146:50948.service - OpenSSH per-connection server daemon (4.153.228.146:50948). 
Jan 15 02:03:27.471000 audit[5085]: USER_ACCT pid=5085 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:27.474222 sshd[5085]: Accepted publickey for core from 4.153.228.146 port 50948 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:03:27.478384 sshd-session[5085]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:03:27.476000 audit[5085]: CRED_ACQ pid=5085 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:27.482191 kernel: audit: type=1101 audit(1768442607.471:766): pid=5085 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:27.482310 kernel: audit: type=1103 audit(1768442607.476:767): pid=5085 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:27.488247 kernel: audit: type=1006 audit(1768442607.476:768): pid=5085 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Jan 15 02:03:27.476000 audit[5085]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd5aef6410 a2=3 a3=0 items=0 ppid=1 pid=5085 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:27.493938 kernel: audit: type=1300 audit(1768442607.476:768): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd5aef6410 a2=3 a3=0 items=0 ppid=1 pid=5085 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:27.476000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:27.498653 kernel: audit: type=1327 audit(1768442607.476:768): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:27.506408 systemd-logind[1687]: New session 12 of user core. Jan 15 02:03:27.512875 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jan 15 02:03:27.519000 audit[5085]: USER_START pid=5085 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:27.526188 kernel: audit: type=1105 audit(1768442607.519:769): pid=5085 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:27.528000 audit[5088]: CRED_ACQ pid=5088 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:27.534223 kernel: audit: type=1103 audit(1768442607.528:770): pid=5088 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:27.881247 sshd[5088]: Connection closed by 4.153.228.146 port 50948 Jan 15 02:03:27.883641 sshd-session[5085]: pam_unix(sshd:session): session closed for user core Jan 15 02:03:27.885000 audit[5085]: USER_END pid=5085 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:27.887961 systemd-logind[1687]: Session 12 logged out. Waiting for processes to exit. Jan 15 02:03:27.889941 systemd[1]: sshd@11-10.0.1.164:22-4.153.228.146:50948.service: Deactivated successfully. Jan 15 02:03:27.891176 kernel: audit: type=1106 audit(1768442607.885:771): pid=5085 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:27.886000 audit[5085]: CRED_DISP pid=5085 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:27.892968 systemd[1]: session-12.scope: Deactivated successfully. Jan 15 02:03:27.895112 systemd-logind[1687]: Removed session 12. Jan 15 02:03:27.896473 kernel: audit: type=1104 audit(1768442607.886:772): pid=5085 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:27.890000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.1.164:22-4.153.228.146:50948 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 02:03:27.998000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.1.164:22-4.153.228.146:50964 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:27.998335 systemd[1]: Started sshd@12-10.0.1.164:22-4.153.228.146:50964.service - OpenSSH per-connection server daemon (4.153.228.146:50964). Jan 15 02:03:28.543000 audit[5101]: USER_ACCT pid=5101 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:28.543614 sshd[5101]: Accepted publickey for core from 4.153.228.146 port 50964 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:03:28.544000 audit[5101]: CRED_ACQ pid=5101 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:28.544000 audit[5101]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffff2099170 a2=3 a3=0 items=0 ppid=1 pid=5101 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:28.544000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:28.545756 sshd-session[5101]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:03:28.552487 systemd-logind[1687]: New session 13 of user core. Jan 15 02:03:28.559367 systemd[1]: Started session-13.scope - Session 13 of User core. 
Jan 15 02:03:28.564000 audit[5101]: USER_START pid=5101 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:28.567000 audit[5104]: CRED_ACQ pid=5104 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:28.738962 kubelet[2935]: E0115 02:03:28.738651 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:03:28.980427 sshd[5104]: Connection closed by 4.153.228.146 port 50964 Jan 15 02:03:28.980713 sshd-session[5101]: pam_unix(sshd:session): session closed for user core Jan 15 02:03:28.982000 audit[5101]: USER_END pid=5101 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:28.982000 audit[5101]: CRED_DISP pid=5101 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:28.985819 systemd[1]: sshd@12-10.0.1.164:22-4.153.228.146:50964.service: Deactivated successfully. Jan 15 02:03:28.986499 systemd-logind[1687]: Session 13 logged out. Waiting for processes to exit. Jan 15 02:03:28.986000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.1.164:22-4.153.228.146:50964 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:28.989938 systemd[1]: session-13.scope: Deactivated successfully. Jan 15 02:03:28.993845 systemd-logind[1687]: Removed session 13. Jan 15 02:03:29.105000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.1.164:22-4.153.228.146:50970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:29.104945 systemd[1]: Started sshd@13-10.0.1.164:22-4.153.228.146:50970.service - OpenSSH per-connection server daemon (4.153.228.146:50970). 
Jan 15 02:03:29.658883 sshd[5114]: Accepted publickey for core from 4.153.228.146 port 50970 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:03:29.658000 audit[5114]: USER_ACCT pid=5114 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:29.660000 audit[5114]: CRED_ACQ pid=5114 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:29.660000 audit[5114]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffca04f38d0 a2=3 a3=0 items=0 ppid=1 pid=5114 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:29.660000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:29.661987 sshd-session[5114]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:03:29.669071 systemd-logind[1687]: New session 14 of user core. Jan 15 02:03:29.675553 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 15 02:03:29.681000 audit[5114]: USER_START pid=5114 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:29.685000 audit[5117]: CRED_ACQ pid=5117 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:29.740359 kubelet[2935]: E0115 02:03:29.740308 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:03:30.078311 sshd[5117]: Connection closed by 4.153.228.146 port 50970 Jan 15 02:03:30.079347 sshd-session[5114]: pam_unix(sshd:session): session closed for user core Jan 15 02:03:30.080000 audit[5114]: USER_END pid=5114 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:30.080000 audit[5114]: CRED_DISP pid=5114 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:30.083861 
systemd[1]: sshd@13-10.0.1.164:22-4.153.228.146:50970.service: Deactivated successfully. Jan 15 02:03:30.083000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.1.164:22-4.153.228.146:50970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:30.087110 systemd[1]: session-14.scope: Deactivated successfully. Jan 15 02:03:30.088707 systemd-logind[1687]: Session 14 logged out. Waiting for processes to exit. Jan 15 02:03:30.089995 systemd-logind[1687]: Removed session 14. Jan 15 02:03:30.739666 kubelet[2935]: E0115 02:03:30.739584 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:03:35.187620 systemd[1]: Started sshd@14-10.0.1.164:22-4.153.228.146:40414.service - OpenSSH per-connection server daemon (4.153.228.146:40414). Jan 15 02:03:35.186000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.1.164:22-4.153.228.146:40414 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:35.188887 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 15 02:03:35.188931 kernel: audit: type=1130 audit(1768442615.186:792): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.1.164:22-4.153.228.146:40414 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 02:03:35.713000 audit[5133]: USER_ACCT pid=5133 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:35.717546 sshd[5133]: Accepted publickey for core from 4.153.228.146 port 40414 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:03:35.722494 sshd-session[5133]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:03:35.720000 audit[5133]: CRED_ACQ pid=5133 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:35.731598 kernel: audit: type=1101 audit(1768442615.713:793): pid=5133 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:35.731746 kernel: audit: type=1103 audit(1768442615.720:794): pid=5133 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:35.755225 kernel: audit: type=1006 audit(1768442615.720:795): pid=5133 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Jan 15 02:03:35.752522 systemd-logind[1687]: New session 15 of user core. Jan 15 02:03:35.756324 kubelet[2935]: E0115 02:03:35.756268 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:03:35.758465 systemd[1]: Started session-15.scope - Session 15 of User core. 
Jan 15 02:03:35.720000 audit[5133]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffee10433b0 a2=3 a3=0 items=0 ppid=1 pid=5133 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:35.774223 kernel: audit: type=1300 audit(1768442615.720:795): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffee10433b0 a2=3 a3=0 items=0 ppid=1 pid=5133 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:35.720000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:35.778181 kernel: audit: type=1327 audit(1768442615.720:795): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:35.777000 audit[5133]: USER_START pid=5133 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:35.788264 kernel: audit: type=1105 audit(1768442615.777:796): pid=5133 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:35.790000 audit[5136]: CRED_ACQ pid=5136 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:35.800214 kernel: audit: type=1103 audit(1768442615.790:797): pid=5136 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:36.122198 sshd[5136]: Connection closed by 4.153.228.146 port 40414 Jan 15 02:03:36.121582 sshd-session[5133]: pam_unix(sshd:session): session closed for user core Jan 15 02:03:36.121000 audit[5133]: USER_END pid=5133 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:36.129256 kernel: audit: type=1106 audit(1768442616.121:798): pid=5133 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:36.121000 audit[5133]: CRED_DISP pid=5133 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:36.132583 systemd-logind[1687]: Session 15 logged out. 
Waiting for processes to exit. Jan 15 02:03:36.133207 kernel: audit: type=1104 audit(1768442616.121:799): pid=5133 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:36.134470 systemd[1]: sshd@14-10.0.1.164:22-4.153.228.146:40414.service: Deactivated successfully. Jan 15 02:03:36.133000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.1.164:22-4.153.228.146:40414 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:36.136800 systemd[1]: session-15.scope: Deactivated successfully. Jan 15 02:03:36.138367 systemd-logind[1687]: Removed session 15. Jan 15 02:03:36.738198 kubelet[2935]: E0115 02:03:36.738109 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:03:40.738018 kubelet[2935]: E0115 02:03:40.737939 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:03:41.242735 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 02:03:41.242835 kernel: audit: type=1130 audit(1768442621.234:801): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.1.164:22-4.153.228.146:40420 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:41.234000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.1.164:22-4.153.228.146:40420 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:41.235633 systemd[1]: Started sshd@15-10.0.1.164:22-4.153.228.146:40420.service - OpenSSH per-connection server daemon (4.153.228.146:40420). 
Jan 15 02:03:41.782000 audit[5173]: USER_ACCT pid=5173 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:41.785201 sshd[5173]: Accepted publickey for core from 4.153.228.146 port 40420 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:03:41.787444 sshd-session[5173]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:03:41.794178 kernel: audit: type=1101 audit(1768442621.782:802): pid=5173 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:41.782000 audit[5173]: CRED_ACQ pid=5173 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:41.805465 kernel: audit: type=1103 audit(1768442621.782:803): pid=5173 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:41.805552 kernel: audit: type=1006 audit(1768442621.782:804): pid=5173 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 15 02:03:41.782000 audit[5173]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdb58b3bd0 a2=3 a3=0 items=0 ppid=1 pid=5173 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:41.817184 kernel: audit: type=1300 audit(1768442621.782:804): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdb58b3bd0 a2=3 a3=0 items=0 ppid=1 pid=5173 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:41.782000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:41.821891 kernel: audit: type=1327 audit(1768442621.782:804): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:41.822021 systemd-logind[1687]: New session 16 of user core. Jan 15 02:03:41.826577 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jan 15 02:03:41.828000 audit[5173]: USER_START pid=5173 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:41.828000 audit[5176]: CRED_ACQ pid=5176 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:41.839054 kernel: audit: type=1105 audit(1768442621.828:805): pid=5173 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:41.839103 kernel: audit: type=1103 audit(1768442621.828:806): pid=5176 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:42.262214 sshd[5176]: Connection closed by 4.153.228.146 port 40420 Jan 15 02:03:42.264447 sshd-session[5173]: pam_unix(sshd:session): session closed for user core Jan 15 02:03:42.266000 audit[5173]: USER_END pid=5173 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:42.274933 systemd[1]: sshd@15-10.0.1.164:22-4.153.228.146:40420.service: Deactivated successfully. Jan 15 02:03:42.266000 audit[5173]: CRED_DISP pid=5173 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:42.281454 systemd[1]: session-16.scope: Deactivated successfully. Jan 15 02:03:42.283755 kernel: audit: type=1106 audit(1768442622.266:807): pid=5173 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:42.283855 kernel: audit: type=1104 audit(1768442622.266:808): pid=5173 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:42.287076 systemd-logind[1687]: Session 16 logged out. Waiting for processes to exit. Jan 15 02:03:42.274000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.1.164:22-4.153.228.146:40420 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:42.291170 systemd-logind[1687]: Removed session 16. 
Jan 15 02:03:42.738472 kubelet[2935]: E0115 02:03:42.738402 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:03:43.745967 kubelet[2935]: E0115 02:03:43.745778 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:03:44.739361 kubelet[2935]: E0115 02:03:44.739320 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:03:47.372246 systemd[1]: Started sshd@16-10.0.1.164:22-4.153.228.146:43310.service - OpenSSH per-connection server daemon (4.153.228.146:43310). Jan 15 02:03:47.375372 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 02:03:47.375399 kernel: audit: type=1130 audit(1768442627.371:810): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.1.164:22-4.153.228.146:43310 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:47.371000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.1.164:22-4.153.228.146:43310 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 02:03:47.738975 kubelet[2935]: E0115 02:03:47.738350 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:03:47.921000 audit[5190]: USER_ACCT pid=5190 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:47.926384 sshd[5190]: Accepted publickey for core from 4.153.228.146 port 43310 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:03:47.930826 sshd-session[5190]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:03:47.937313 kernel: audit: type=1101 audit(1768442627.921:811): pid=5190 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:47.937561 kernel: audit: type=1103 audit(1768442627.928:812): pid=5190 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:47.928000 audit[5190]: CRED_ACQ pid=5190 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:47.950802 kernel: audit: type=1006 audit(1768442627.928:813): pid=5190 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Jan 15 02:03:47.928000 audit[5190]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdd9337df0 a2=3 a3=0 items=0 ppid=1 pid=5190 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:47.956651 systemd-logind[1687]: New session 17 of user core. Jan 15 02:03:47.959759 kernel: audit: type=1300 audit(1768442627.928:813): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdd9337df0 a2=3 a3=0 items=0 ppid=1 pid=5190 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:47.928000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:47.963957 kernel: audit: type=1327 audit(1768442627.928:813): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:47.967837 systemd[1]: Started session-17.scope - Session 17 of User core. 
Jan 15 02:03:47.976000 audit[5190]: USER_START pid=5190 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:47.980000 audit[5193]: CRED_ACQ pid=5193 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:47.987319 kernel: audit: type=1105 audit(1768442627.976:814): pid=5190 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:47.987396 kernel: audit: type=1103 audit(1768442627.980:815): pid=5193 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:48.358008 sshd[5193]: Connection closed by 4.153.228.146 port 43310 Jan 15 02:03:48.359532 sshd-session[5190]: pam_unix(sshd:session): session closed for user core Jan 15 02:03:48.362000 audit[5190]: USER_END pid=5190 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:48.375284 kernel: audit: type=1106 audit(1768442628.362:816): pid=5190 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:48.362000 audit[5190]: CRED_DISP pid=5190 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:48.380536 systemd[1]: sshd@16-10.0.1.164:22-4.153.228.146:43310.service: Deactivated successfully. Jan 15 02:03:48.383725 systemd-logind[1687]: Session 17 logged out. Waiting for processes to exit. Jan 15 02:03:48.384271 kernel: audit: type=1104 audit(1768442628.362:817): pid=5190 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:48.378000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.1.164:22-4.153.228.146:43310 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:48.386976 systemd[1]: session-17.scope: Deactivated successfully. Jan 15 02:03:48.391190 systemd-logind[1687]: Removed session 17. 
Jan 15 02:03:48.473949 systemd[1]: Started sshd@17-10.0.1.164:22-4.153.228.146:43316.service - OpenSSH per-connection server daemon (4.153.228.146:43316). Jan 15 02:03:48.473000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.1.164:22-4.153.228.146:43316 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:49.018000 audit[5204]: USER_ACCT pid=5204 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:49.019937 sshd[5204]: Accepted publickey for core from 4.153.228.146 port 43316 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:03:49.020000 audit[5204]: CRED_ACQ pid=5204 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:49.020000 audit[5204]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc68dbcc30 a2=3 a3=0 items=0 ppid=1 pid=5204 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:49.020000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:49.021981 sshd-session[5204]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:03:49.027324 systemd-logind[1687]: New session 18 of user core. Jan 15 02:03:49.037246 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 15 02:03:49.040000 audit[5204]: USER_START pid=5204 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:49.043000 audit[5207]: CRED_ACQ pid=5207 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:49.709196 sshd[5207]: Connection closed by 4.153.228.146 port 43316 Jan 15 02:03:49.708856 sshd-session[5204]: pam_unix(sshd:session): session closed for user core Jan 15 02:03:49.709000 audit[5204]: USER_END pid=5204 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:49.709000 audit[5204]: CRED_DISP pid=5204 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:49.712727 systemd[1]: sshd@17-10.0.1.164:22-4.153.228.146:43316.service: Deactivated successfully. 
Jan 15 02:03:49.711000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.1.164:22-4.153.228.146:43316 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:49.713084 systemd-logind[1687]: Session 18 logged out. Waiting for processes to exit. Jan 15 02:03:49.714974 systemd[1]: session-18.scope: Deactivated successfully. Jan 15 02:03:49.716776 systemd-logind[1687]: Removed session 18. Jan 15 02:03:49.738393 kubelet[2935]: E0115 02:03:49.738298 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:03:49.816000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.1.164:22-4.153.228.146:43320 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:49.817365 systemd[1]: Started sshd@18-10.0.1.164:22-4.153.228.146:43320.service - OpenSSH per-connection server daemon (4.153.228.146:43320). Jan 15 02:03:50.369000 audit[5217]: USER_ACCT pid=5217 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:50.371479 sshd[5217]: Accepted publickey for core from 4.153.228.146 port 43320 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:03:50.371000 audit[5217]: CRED_ACQ pid=5217 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:50.371000 audit[5217]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe2bdde720 a2=3 a3=0 items=0 ppid=1 pid=5217 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:50.371000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:50.373917 sshd-session[5217]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:03:50.391004 systemd-logind[1687]: New session 19 of user core. Jan 15 02:03:50.402567 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 15 02:03:50.412000 audit[5217]: USER_START pid=5217 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:50.417000 audit[5220]: CRED_ACQ pid=5220 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:51.362000 audit[5239]: NETFILTER_CFG table=filter:138 family=2 entries=26 op=nft_register_rule pid=5239 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:03:51.362000 audit[5239]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7fff95365e20 a2=0 a3=7fff95365e0c items=0 ppid=3037 pid=5239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:51.362000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:03:51.367000 audit[5239]: NETFILTER_CFG table=nat:139 family=2 entries=20 op=nft_register_rule pid=5239 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:03:51.367000 audit[5239]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fff95365e20 a2=0 a3=0 items=0 ppid=3037 pid=5239 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:51.367000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:03:51.385000 audit[5241]: NETFILTER_CFG table=filter:140 family=2 entries=38 op=nft_register_rule pid=5241 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:03:51.385000 audit[5241]: SYSCALL arch=c000003e syscall=46 success=yes exit=14176 a0=3 a1=7ffc5da3c370 a2=0 a3=7ffc5da3c35c items=0 ppid=3037 pid=5241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:51.385000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:03:51.393000 audit[5241]: NETFILTER_CFG table=nat:141 family=2 entries=20 op=nft_register_rule pid=5241 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:03:51.393000 audit[5241]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffc5da3c370 a2=0 a3=0 items=0 ppid=3037 pid=5241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:51.393000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:03:51.466264 sshd[5220]: Connection closed by 4.153.228.146 port 43320 Jan 15 02:03:51.466679 sshd-session[5217]: pam_unix(sshd:session): session closed for user core 
Jan 15 02:03:51.467000 audit[5217]: USER_END pid=5217 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:51.467000 audit[5217]: CRED_DISP pid=5217 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:51.471643 systemd[1]: sshd@18-10.0.1.164:22-4.153.228.146:43320.service: Deactivated successfully. Jan 15 02:03:51.470000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.1.164:22-4.153.228.146:43320 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:51.473876 systemd[1]: session-19.scope: Deactivated successfully. Jan 15 02:03:51.475552 systemd-logind[1687]: Session 19 logged out. Waiting for processes to exit. Jan 15 02:03:51.476794 systemd-logind[1687]: Removed session 19. Jan 15 02:03:51.579479 systemd[1]: Started sshd@19-10.0.1.164:22-4.153.228.146:43334.service - OpenSSH per-connection server daemon (4.153.228.146:43334). Jan 15 02:03:51.578000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.1.164:22-4.153.228.146:43334 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:51.742126 kubelet[2935]: E0115 02:03:51.742005 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:03:52.138000 audit[5246]: USER_ACCT pid=5246 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:52.139764 sshd[5246]: Accepted publickey for core from 4.153.228.146 port 43334 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:03:52.140000 audit[5246]: CRED_ACQ pid=5246 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:52.140000 audit[5246]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffdd9e68170 a2=3 a3=0 items=0 ppid=1 pid=5246 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:52.140000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:52.141929 sshd-session[5246]: 
pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:03:52.147843 systemd-logind[1687]: New session 20 of user core. Jan 15 02:03:52.153310 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 15 02:03:52.156000 audit[5246]: USER_START pid=5246 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:52.158000 audit[5249]: CRED_ACQ pid=5249 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:52.713797 sshd[5249]: Connection closed by 4.153.228.146 port 43334 Jan 15 02:03:52.714600 sshd-session[5246]: pam_unix(sshd:session): session closed for user core Jan 15 02:03:52.715000 audit[5246]: USER_END pid=5246 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:52.718886 kernel: kauditd_printk_skb: 43 callbacks suppressed Jan 15 02:03:52.718954 kernel: audit: type=1106 audit(1768442632.715:847): pid=5246 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:52.715000 audit[5246]: CRED_DISP pid=5246 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:52.724058 kernel: audit: type=1104 audit(1768442632.715:848): pid=5246 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:52.724345 systemd[1]: sshd@19-10.0.1.164:22-4.153.228.146:43334.service: Deactivated successfully. Jan 15 02:03:52.724000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.1.164:22-4.153.228.146:43334 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:52.728952 kernel: audit: type=1131 audit(1768442632.724:849): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.1.164:22-4.153.228.146:43334 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:52.728719 systemd[1]: session-20.scope: Deactivated successfully. Jan 15 02:03:52.730177 systemd-logind[1687]: Session 20 logged out. Waiting for processes to exit. Jan 15 02:03:52.732173 systemd-logind[1687]: Removed session 20. 
Jan 15 02:03:52.817000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.1.164:22-4.153.228.146:43348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:52.818769 systemd[1]: Started sshd@20-10.0.1.164:22-4.153.228.146:43348.service - OpenSSH per-connection server daemon (4.153.228.146:43348). Jan 15 02:03:52.823188 kernel: audit: type=1130 audit(1768442632.817:850): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.1.164:22-4.153.228.146:43348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:53.350000 audit[5258]: USER_ACCT pid=5258 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:53.356900 sshd-session[5258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:03:53.360542 sshd[5258]: Accepted publickey for core from 4.153.228.146 port 43348 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:03:53.368201 kernel: audit: type=1101 audit(1768442633.350:851): pid=5258 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:53.351000 audit[5258]: CRED_ACQ pid=5258 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:53.381353 kernel: audit: type=1103 audit(1768442633.351:852): pid=5258 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:53.381501 kernel: audit: type=1006 audit(1768442633.351:853): pid=5258 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Jan 15 02:03:53.389298 kernel: audit: type=1300 audit(1768442633.351:853): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd44272e00 a2=3 a3=0 items=0 ppid=1 pid=5258 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:53.351000 audit[5258]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd44272e00 a2=3 a3=0 items=0 ppid=1 pid=5258 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:53.351000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:53.401545 kernel: audit: type=1327 audit(1768442633.351:853): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:53.408620 systemd-logind[1687]: New session 21 of user core. 
Jan 15 02:03:53.411478 systemd[1]: Started session-21.scope - Session 21 of User core. Jan 15 02:03:53.417000 audit[5258]: USER_START pid=5258 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:53.424000 audit[5261]: CRED_ACQ pid=5261 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:53.431214 kernel: audit: type=1105 audit(1768442633.417:854): pid=5258 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:53.761977 sshd[5261]: Connection closed by 4.153.228.146 port 43348 Jan 15 02:03:53.763880 sshd-session[5258]: pam_unix(sshd:session): session closed for user core Jan 15 02:03:53.763000 audit[5258]: USER_END pid=5258 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:53.763000 audit[5258]: CRED_DISP pid=5258 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:53.766000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.1.164:22-4.153.228.146:43348 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:53.767007 systemd[1]: sshd@20-10.0.1.164:22-4.153.228.146:43348.service: Deactivated successfully. Jan 15 02:03:53.769001 systemd[1]: session-21.scope: Deactivated successfully. Jan 15 02:03:53.770229 systemd-logind[1687]: Session 21 logged out. Waiting for processes to exit. Jan 15 02:03:53.771298 systemd-logind[1687]: Removed session 21. 
Jan 15 02:03:54.739569 kubelet[2935]: E0115 02:03:54.739402 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:03:55.739352 kubelet[2935]: E0115 02:03:55.739252 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:03:58.260000 audit[5273]: NETFILTER_CFG table=filter:142 family=2 entries=26 op=nft_register_rule pid=5273 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:03:58.262609 kernel: kauditd_printk_skb: 4 callbacks suppressed Jan 15 02:03:58.262687 kernel: audit: type=1325 audit(1768442638.260:859): table=filter:142 family=2 entries=26 op=nft_register_rule pid=5273 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:03:58.267252 kernel: audit: type=1300 audit(1768442638.260:859): arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffc012a24e0 a2=0 a3=7ffc012a24cc items=0 ppid=3037 pid=5273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:58.260000 audit[5273]: SYSCALL arch=c000003e syscall=46 success=yes exit=5248 a0=3 a1=7ffc012a24e0 a2=0 a3=7ffc012a24cc items=0 ppid=3037 pid=5273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:58.260000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:03:58.275669 kernel: audit: type=1327 audit(1768442638.260:859): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:03:58.281000 audit[5273]: NETFILTER_CFG table=nat:143 family=2 entries=104 op=nft_register_chain 
pid=5273 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:03:58.281000 audit[5273]: SYSCALL arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7ffc012a24e0 a2=0 a3=7ffc012a24cc items=0 ppid=3037 pid=5273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:58.289730 kernel: audit: type=1325 audit(1768442638.281:860): table=nat:143 family=2 entries=104 op=nft_register_chain pid=5273 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 15 02:03:58.289824 kernel: audit: type=1300 audit(1768442638.281:860): arch=c000003e syscall=46 success=yes exit=48684 a0=3 a1=7ffc012a24e0 a2=0 a3=7ffc012a24cc items=0 ppid=3037 pid=5273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:58.281000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:03:58.295195 kernel: audit: type=1327 audit(1768442638.281:860): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 15 02:03:58.874304 systemd[1]: Started sshd@21-10.0.1.164:22-4.153.228.146:37122.service - OpenSSH per-connection server daemon (4.153.228.146:37122). Jan 15 02:03:58.873000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.1.164:22-4.153.228.146:37122 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:58.881218 kernel: audit: type=1130 audit(1768442638.873:861): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.1.164:22-4.153.228.146:37122 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 02:03:59.414000 audit[5275]: USER_ACCT pid=5275 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:59.422448 sshd[5275]: Accepted publickey for core from 4.153.228.146 port 37122 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:03:59.428708 sshd-session[5275]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:03:59.431270 kernel: audit: type=1101 audit(1768442639.414:862): pid=5275 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:59.426000 audit[5275]: CRED_ACQ pid=5275 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:59.445213 kernel: audit: type=1103 audit(1768442639.426:863): pid=5275 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:59.456928 kernel: audit: type=1006 audit(1768442639.426:864): pid=5275 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Jan 15 02:03:59.454695 systemd-logind[1687]: New session 22 of user core. Jan 15 02:03:59.426000 audit[5275]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffc7c21e960 a2=3 a3=0 items=0 ppid=1 pid=5275 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:03:59.426000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:03:59.461768 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jan 15 02:03:59.467000 audit[5275]: USER_START pid=5275 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:59.470000 audit[5278]: CRED_ACQ pid=5278 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:59.740818 kubelet[2935]: E0115 02:03:59.740665 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:03:59.876207 sshd[5278]: Connection closed by 4.153.228.146 port 37122 Jan 15 02:03:59.876195 sshd-session[5275]: pam_unix(sshd:session): session closed for user core Jan 15 02:03:59.879000 audit[5275]: USER_END pid=5275 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:59.880000 audit[5275]: CRED_DISP pid=5275 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:03:59.886278 systemd[1]: sshd@21-10.0.1.164:22-4.153.228.146:37122.service: Deactivated successfully. Jan 15 02:03:59.885000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.1.164:22-4.153.228.146:37122 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:03:59.893397 systemd[1]: session-22.scope: Deactivated successfully. Jan 15 02:03:59.897816 systemd-logind[1687]: Session 22 logged out. Waiting for processes to exit. Jan 15 02:03:59.904485 systemd-logind[1687]: Removed session 22. 
Jan 15 02:04:00.740184 kubelet[2935]: E0115 02:04:00.739349 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:04:02.739567 kubelet[2935]: E0115 02:04:02.739476 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:04:03.740166 kubelet[2935]: E0115 02:04:03.739995 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:04:04.986395 systemd[1]: Started sshd@22-10.0.1.164:22-4.153.228.146:42840.service - OpenSSH per-connection server daemon (4.153.228.146:42840). Jan 15 02:04:04.985000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.1.164:22-4.153.228.146:42840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:04:04.987454 kernel: kauditd_printk_skb: 7 callbacks suppressed Jan 15 02:04:04.987517 kernel: audit: type=1130 audit(1768442644.985:870): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.1.164:22-4.153.228.146:42840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 02:04:05.519000 audit[5289]: USER_ACCT pid=5289 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:05.521431 sshd[5289]: Accepted publickey for core from 4.153.228.146 port 42840 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:04:05.522854 sshd-session[5289]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:04:05.526213 kernel: audit: type=1101 audit(1768442645.519:871): pid=5289 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:05.519000 audit[5289]: CRED_ACQ pid=5289 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:05.532204 kernel: audit: type=1103 audit(1768442645.519:872): pid=5289 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:05.532256 kernel: audit: type=1006 audit(1768442645.519:873): pid=5289 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Jan 15 02:04:05.519000 audit[5289]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff45b2f7e0 a2=3 a3=0 items=0 ppid=1 pid=5289 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:04:05.536170 kernel: audit: type=1300 audit(1768442645.519:873): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7fff45b2f7e0 a2=3 a3=0 items=0 ppid=1 pid=5289 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:04:05.519000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:04:05.538647 systemd-logind[1687]: New session 23 of user core. Jan 15 02:04:05.539535 kernel: audit: type=1327 audit(1768442645.519:873): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:04:05.541345 systemd[1]: Started session-23.scope - Session 23 of User core. 
Jan 15 02:04:05.544000 audit[5289]: USER_START pid=5289 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:05.551179 kernel: audit: type=1105 audit(1768442645.544:874): pid=5289 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:05.550000 audit[5292]: CRED_ACQ pid=5292 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:05.556189 kernel: audit: type=1103 audit(1768442645.550:875): pid=5292 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:05.939939 sshd[5292]: Connection closed by 4.153.228.146 port 42840 Jan 15 02:04:05.941335 sshd-session[5289]: pam_unix(sshd:session): session closed for user core Jan 15 02:04:05.942000 audit[5289]: USER_END pid=5289 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:05.947079 systemd[1]: sshd@22-10.0.1.164:22-4.153.228.146:42840.service: Deactivated successfully. Jan 15 02:04:05.948540 kernel: audit: type=1106 audit(1768442645.942:876): pid=5289 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:05.948605 kernel: audit: type=1104 audit(1768442645.942:877): pid=5289 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:05.942000 audit[5289]: CRED_DISP pid=5289 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:05.949485 systemd[1]: session-23.scope: Deactivated successfully. Jan 15 02:04:05.951390 systemd-logind[1687]: Session 23 logged out. Waiting for processes to exit. Jan 15 02:04:05.944000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.1.164:22-4.153.228.146:42840 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:04:05.952932 systemd-logind[1687]: Removed session 23. 
Jan 15 02:04:06.739592 kubelet[2935]: E0115 02:04:06.739448 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:04:06.740639 kubelet[2935]: E0115 02:04:06.739818 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:04:11.076843 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 02:04:11.077052 kernel: audit: type=1130 audit(1768442651.060:879): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.1.164:22-4.153.228.146:42852 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:04:11.060000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.1.164:22-4.153.228.146:42852 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:04:11.061441 systemd[1]: Started sshd@23-10.0.1.164:22-4.153.228.146:42852.service - OpenSSH per-connection server daemon (4.153.228.146:42852). 
Jan 15 02:04:11.651000 audit[5329]: USER_ACCT pid=5329 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:11.656920 sshd[5329]: Accepted publickey for core from 4.153.228.146 port 42852 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:04:11.657175 kernel: audit: type=1101 audit(1768442651.651:880): pid=5329 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:11.657000 audit[5329]: CRED_ACQ pid=5329 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:11.658544 sshd-session[5329]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:04:11.662259 kernel: audit: type=1103 audit(1768442651.657:881): pid=5329 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:11.671989 kernel: audit: type=1006 audit(1768442651.657:882): pid=5329 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Jan 15 02:04:11.672038 kernel: audit: type=1300 audit(1768442651.657:882): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffea7266550 a2=3 a3=0 items=0 ppid=1 pid=5329 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:04:11.657000 audit[5329]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffea7266550 a2=3 a3=0 items=0 ppid=1 pid=5329 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:04:11.675280 kernel: audit: type=1327 audit(1768442651.657:882): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:04:11.657000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:04:11.675415 systemd-logind[1687]: New session 24 of user core. Jan 15 02:04:11.681857 systemd[1]: Started session-24.scope - Session 24 of User core. 
Jan 15 02:04:11.683000 audit[5329]: USER_START pid=5329 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:11.690259 kernel: audit: type=1105 audit(1768442651.683:883): pid=5329 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:11.694719 kernel: audit: type=1103 audit(1768442651.689:884): pid=5332 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:11.689000 audit[5332]: CRED_ACQ pid=5332 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:12.044998 sshd[5332]: Connection closed by 4.153.228.146 port 42852 Jan 15 02:04:12.045986 sshd-session[5329]: pam_unix(sshd:session): session closed for user core Jan 15 02:04:12.047000 audit[5329]: USER_END pid=5329 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:12.055249 systemd[1]: sshd@23-10.0.1.164:22-4.153.228.146:42852.service: Deactivated successfully. Jan 15 02:04:12.057232 kernel: audit: type=1106 audit(1768442652.047:885): pid=5329 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:12.048000 audit[5329]: CRED_DISP pid=5329 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:12.061093 systemd[1]: session-24.scope: Deactivated successfully. Jan 15 02:04:12.063198 kernel: audit: type=1104 audit(1768442652.048:886): pid=5329 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:12.052000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.1.164:22-4.153.228.146:42852 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:04:12.065742 systemd-logind[1687]: Session 24 logged out. Waiting for processes to exit. Jan 15 02:04:12.069082 systemd-logind[1687]: Removed session 24. 
Jan 15 02:04:12.739193 kubelet[2935]: E0115 02:04:12.738894 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:04:12.740414 kubelet[2935]: E0115 02:04:12.740341 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:04:13.741839 kubelet[2935]: E0115 02:04:13.740263 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:04:16.739005 kubelet[2935]: E0115 02:04:16.737952 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:04:17.165797 systemd[1]: Started sshd@24-10.0.1.164:22-4.153.228.146:55814.service - OpenSSH per-connection server daemon (4.153.228.146:55814). Jan 15 02:04:17.180948 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 02:04:17.181046 kernel: audit: type=1130 audit(1768442657.167:888): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.1.164:22-4.153.228.146:55814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:04:17.167000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.1.164:22-4.153.228.146:55814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 02:04:17.775000 audit[5343]: USER_ACCT pid=5343 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:17.786114 sshd[5343]: Accepted publickey for core from 4.153.228.146 port 55814 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:04:17.792895 kernel: audit: type=1101 audit(1768442657.775:889): pid=5343 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:17.793041 kernel: audit: type=1103 audit(1768442657.787:890): pid=5343 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:17.787000 audit[5343]: CRED_ACQ pid=5343 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:17.791469 sshd-session[5343]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:04:17.804429 kernel: audit: type=1006 audit(1768442657.787:891): pid=5343 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Jan 15 02:04:17.808539 kernel: audit: type=1300 audit(1768442657.787:891): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffccb0251e0 a2=3 a3=0 items=0 ppid=1 pid=5343 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:04:17.787000 audit[5343]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffccb0251e0 a2=3 a3=0 items=0 ppid=1 pid=5343 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:04:17.787000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:04:17.824191 kernel: audit: type=1327 audit(1768442657.787:891): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:04:17.830275 systemd-logind[1687]: New session 25 of user core. Jan 15 02:04:17.835378 systemd[1]: Started session-25.scope - Session 25 of User core. 
Jan 15 02:04:17.841000 audit[5343]: USER_START pid=5343 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:17.848212 kernel: audit: type=1105 audit(1768442657.841:892): pid=5343 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:17.851000 audit[5346]: CRED_ACQ pid=5346 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:17.856238 kernel: audit: type=1103 audit(1768442657.851:893): pid=5346 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:18.153668 sshd[5346]: Connection closed by 4.153.228.146 port 55814 Jan 15 02:04:18.154371 sshd-session[5343]: pam_unix(sshd:session): session closed for user core Jan 15 02:04:18.160225 kernel: audit: type=1106 audit(1768442658.155:894): pid=5343 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:18.155000 audit[5343]: USER_END pid=5343 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:18.155000 audit[5343]: CRED_DISP pid=5343 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:18.162129 systemd-logind[1687]: Session 25 logged out. Waiting for processes to exit. Jan 15 02:04:18.164173 kernel: audit: type=1104 audit(1768442658.155:895): pid=5343 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:18.163043 systemd[1]: sshd@24-10.0.1.164:22-4.153.228.146:55814.service: Deactivated successfully. Jan 15 02:04:18.163000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.1.164:22-4.153.228.146:55814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:04:18.165841 systemd[1]: session-25.scope: Deactivated successfully. Jan 15 02:04:18.170744 systemd-logind[1687]: Removed session 25. 
Jan 15 02:04:19.739129 kubelet[2935]: E0115 02:04:19.739080 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:04:20.742001 kubelet[2935]: E0115 02:04:20.741504 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:04:23.264524 systemd[1]: Started sshd@25-10.0.1.164:22-4.153.228.146:55822.service - OpenSSH per-connection server daemon (4.153.228.146:55822). Jan 15 02:04:23.270096 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 02:04:23.270210 kernel: audit: type=1130 audit(1768442663.263:897): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.1.164:22-4.153.228.146:55822 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:04:23.263000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.1.164:22-4.153.228.146:55822 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 02:04:23.788396 sshd[5368]: Accepted publickey for core from 4.153.228.146 port 55822 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:04:23.787000 audit[5368]: USER_ACCT pid=5368 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:23.792447 sshd-session[5368]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:04:23.793167 kernel: audit: type=1101 audit(1768442663.787:898): pid=5368 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:23.790000 audit[5368]: CRED_ACQ pid=5368 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:23.798176 kernel: audit: type=1103 audit(1768442663.790:899): pid=5368 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:23.790000 audit[5368]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe46cda5b0 a2=3 a3=0 items=0 ppid=1 pid=5368 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:04:23.803612 kernel: audit: type=1006 audit(1768442663.790:900): pid=5368 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Jan 15 02:04:23.803664 kernel: audit: type=1300 audit(1768442663.790:900): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffe46cda5b0 a2=3 a3=0 items=0 ppid=1 pid=5368 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:04:23.806716 kernel: audit: type=1327 audit(1768442663.790:900): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:04:23.790000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:04:23.804380 systemd-logind[1687]: New session 26 of user core. Jan 15 02:04:23.810348 systemd[1]: Started session-26.scope - Session 26 of User core. 
Jan 15 02:04:23.814000 audit[5368]: USER_START pid=5368 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:23.820169 kernel: audit: type=1105 audit(1768442663.814:901): pid=5368 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:23.819000 audit[5371]: CRED_ACQ pid=5371 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:23.825169 kernel: audit: type=1103 audit(1768442663.819:902): pid=5371 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:24.232129 sshd[5371]: Connection closed by 4.153.228.146 port 55822 Jan 15 02:04:24.232950 sshd-session[5368]: pam_unix(sshd:session): session closed for user core Jan 15 02:04:24.234000 audit[5368]: USER_END pid=5368 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:24.245895 systemd[1]: sshd@25-10.0.1.164:22-4.153.228.146:55822.service: Deactivated successfully. Jan 15 02:04:24.251668 kernel: audit: type=1106 audit(1768442664.234:903): pid=5368 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:24.251763 kernel: audit: type=1104 audit(1768442664.236:904): pid=5368 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:24.236000 audit[5368]: CRED_DISP pid=5368 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:24.248853 systemd[1]: session-26.scope: Deactivated successfully. Jan 15 02:04:24.245000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.1.164:22-4.153.228.146:55822 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:04:24.254950 systemd-logind[1687]: Session 26 logged out. Waiting for processes to exit. Jan 15 02:04:24.256695 systemd-logind[1687]: Removed session 26. 
Jan 15 02:04:24.738134 kubelet[2935]: E0115 02:04:24.738100 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:04:25.739793 kubelet[2935]: E0115 02:04:25.739081 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:04:27.738080 kubelet[2935]: E0115 02:04:27.738044 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:04:29.342394 systemd[1]: Started sshd@26-10.0.1.164:22-4.153.228.146:39428.service - OpenSSH per-connection server daemon (4.153.228.146:39428). Jan 15 02:04:29.341000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.1.164:22-4.153.228.146:39428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:04:29.343821 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 15 02:04:29.343876 kernel: audit: type=1130 audit(1768442669.341:906): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.1.164:22-4.153.228.146:39428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 15 02:04:29.893000 audit[5382]: USER_ACCT pid=5382 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:29.895310 sshd[5382]: Accepted publickey for core from 4.153.228.146 port 39428 ssh2: RSA SHA256:JommcX/EFKwAoLkD26GQNXh7Epzh5SgFoQyOyMeE5JA Jan 15 02:04:29.897760 sshd-session[5382]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 15 02:04:29.896000 audit[5382]: CRED_ACQ pid=5382 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:29.901194 kernel: audit: type=1101 audit(1768442669.893:907): pid=5382 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:29.901236 kernel: audit: type=1103 audit(1768442669.896:908): pid=5382 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:29.907356 systemd-logind[1687]: New session 27 of user core. Jan 15 02:04:29.896000 audit[5382]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd05346c90 a2=3 a3=0 items=0 ppid=1 pid=5382 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:04:29.912480 kernel: audit: type=1006 audit(1768442669.896:909): pid=5382 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Jan 15 02:04:29.912618 kernel: audit: type=1300 audit(1768442669.896:909): arch=c000003e syscall=1 success=yes exit=3 a0=8 a1=7ffd05346c90 a2=3 a3=0 items=0 ppid=1 pid=5382 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 15 02:04:29.896000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:04:29.916086 kernel: audit: type=1327 audit(1768442669.896:909): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 15 02:04:29.916298 systemd[1]: Started session-27.scope - Session 27 of User core. 
Jan 15 02:04:29.918000 audit[5382]: USER_START pid=5382 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:29.922000 audit[5385]: CRED_ACQ pid=5385 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:29.925589 kernel: audit: type=1105 audit(1768442669.918:910): pid=5382 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:29.925621 kernel: audit: type=1103 audit(1768442669.922:911): pid=5385 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:30.340317 sshd[5385]: Connection closed by 4.153.228.146 port 39428 Jan 15 02:04:30.342765 sshd-session[5382]: pam_unix(sshd:session): session closed for user core Jan 15 02:04:30.345000 audit[5382]: USER_END pid=5382 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:30.357240 systemd-logind[1687]: Session 27 logged out. Waiting for processes to exit. Jan 15 02:04:30.358900 systemd[1]: sshd@26-10.0.1.164:22-4.153.228.146:39428.service: Deactivated successfully. Jan 15 02:04:30.364486 kernel: audit: type=1106 audit(1768442670.345:912): pid=5382 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:30.364618 kernel: audit: type=1104 audit(1768442670.346:913): pid=5382 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:30.346000 audit[5382]: CRED_DISP pid=5382 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 15 02:04:30.368786 systemd[1]: session-27.scope: Deactivated successfully. Jan 15 02:04:30.359000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.1.164:22-4.153.228.146:39428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 15 02:04:30.379392 systemd-logind[1687]: Removed session 27. 
Jan 15 02:04:30.742514 containerd[1710]: time="2026-01-15T02:04:30.741680693Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 15 02:04:30.743257 kubelet[2935]: E0115 02:04:30.741966 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4" Jan 15 02:04:31.102173 containerd[1710]: time="2026-01-15T02:04:31.101679896Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:04:31.103885 containerd[1710]: time="2026-01-15T02:04:31.103849780Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 15 02:04:31.104092 containerd[1710]: time="2026-01-15T02:04:31.103914353Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 15 02:04:31.104718 kubelet[2935]: E0115 02:04:31.104010 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 02:04:31.104718 kubelet[2935]: E0115 02:04:31.104046 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 15 02:04:31.104718 kubelet[2935]: E0115 02:04:31.104628 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2xk5t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b978c6cc9-jnxw2_calico-system(85d7699e-6a0a-4fd1-96b0-9365b90d23ad): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 15 02:04:31.106632 kubelet[2935]: E0115 02:04:31.106564 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b978c6cc9-jnxw2" podUID="85d7699e-6a0a-4fd1-96b0-9365b90d23ad" Jan 15 02:04:32.737986 containerd[1710]: time="2026-01-15T02:04:32.737951351Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 15 02:04:33.080051 containerd[1710]: 
time="2026-01-15T02:04:33.080000450Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:04:33.082576 containerd[1710]: time="2026-01-15T02:04:33.082474435Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 15 02:04:33.082576 containerd[1710]: time="2026-01-15T02:04:33.082478214Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 15 02:04:33.082793 kubelet[2935]: E0115 02:04:33.082683 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 02:04:33.083239 kubelet[2935]: E0115 02:04:33.083179 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 15 02:04:33.083392 kubelet[2935]: E0115 02:04:33.083288 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:66ec0441fba04fd9bc4d56215e73797e,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-swvrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cd9b9db69-tblzt_calico-system(0839e1a2-38f7-4739-be41-8f605565d9d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 15 02:04:33.085493 containerd[1710]: time="2026-01-15T02:04:33.085436590Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 15 02:04:33.425333 containerd[1710]: time="2026-01-15T02:04:33.424560783Z" level=info msg="fetch failed after status: 404 Not Found" 
host=ghcr.io Jan 15 02:04:33.426902 containerd[1710]: time="2026-01-15T02:04:33.426712605Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 15 02:04:33.426902 containerd[1710]: time="2026-01-15T02:04:33.426786005Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 15 02:04:33.427469 kubelet[2935]: E0115 02:04:33.427362 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 02:04:33.428146 kubelet[2935]: E0115 02:04:33.427476 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 15 02:04:33.428146 kubelet[2935]: E0115 02:04:33.427693 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-swvrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5cd9b9db69-tblzt_calico-system(0839e1a2-38f7-4739-be41-8f605565d9d2): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 15 02:04:33.429774 kubelet[2935]: E0115 02:04:33.429712 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5cd9b9db69-tblzt" podUID="0839e1a2-38f7-4739-be41-8f605565d9d2" Jan 15 02:04:35.739778 containerd[1710]: time="2026-01-15T02:04:35.739657684Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 15 02:04:36.261334 containerd[1710]: time="2026-01-15T02:04:36.261289803Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:04:36.264270 containerd[1710]: time="2026-01-15T02:04:36.264182169Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 15 02:04:36.264270 containerd[1710]: time="2026-01-15T02:04:36.264251274Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 15 02:04:36.264538 kubelet[2935]: E0115 02:04:36.264498 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 02:04:36.264865 kubelet[2935]: E0115 02:04:36.264550 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 15 02:04:36.264865 kubelet[2935]: E0115 02:04:36.264672 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hkwh7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-5x8dj_calico-system(7529b612-ddf2-4fb7-9823-720a3bd71760): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 15 02:04:36.266673 kubelet[2935]: E0115 02:04:36.266621 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-5x8dj" podUID="7529b612-ddf2-4fb7-9823-720a3bd71760" Jan 15 02:04:39.744709 containerd[1710]: time="2026-01-15T02:04:39.743568751Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 
15 02:04:40.084078 containerd[1710]: time="2026-01-15T02:04:40.083973854Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:04:40.086242 containerd[1710]: time="2026-01-15T02:04:40.086137382Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 02:04:40.086445 containerd[1710]: time="2026-01-15T02:04:40.086305002Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 02:04:40.086777 kubelet[2935]: E0115 02:04:40.086699 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:04:40.089754 kubelet[2935]: E0115 02:04:40.086827 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:04:40.089754 kubelet[2935]: E0115 02:04:40.087903 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzplf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod calico-apiserver-5dcd9489f8-x7ppt_calico-apiserver(768894fc-e95b-49e5-9a90-1487b94ce02a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 02:04:40.089754 kubelet[2935]: E0115 02:04:40.089725 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-x7ppt" podUID="768894fc-e95b-49e5-9a90-1487b94ce02a" Jan 15 02:04:40.090217 containerd[1710]: time="2026-01-15T02:04:40.087527698Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 15 02:04:40.424890 containerd[1710]: time="2026-01-15T02:04:40.424790198Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:04:40.427300 containerd[1710]: time="2026-01-15T02:04:40.427265020Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 15 02:04:40.427368 containerd[1710]: time="2026-01-15T02:04:40.427302045Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 15 02:04:40.427520 kubelet[2935]: E0115 02:04:40.427491 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:04:40.427557 kubelet[2935]: E0115 02:04:40.427532 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 15 02:04:40.427669 kubelet[2935]: E0115 02:04:40.427639 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nzlsr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5dcd9489f8-9mlsx_calico-apiserver(19301f0b-a1cd-4987-b493-85e61d59a457): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 15 02:04:40.429023 kubelet[2935]: E0115 02:04:40.428996 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5dcd9489f8-9mlsx" podUID="19301f0b-a1cd-4987-b493-85e61d59a457" Jan 15 02:04:42.738165 containerd[1710]: time="2026-01-15T02:04:42.738122396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 15 02:04:43.076649 containerd[1710]: time="2026-01-15T02:04:43.076348381Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:04:43.078925 containerd[1710]: time="2026-01-15T02:04:43.078713166Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 15 02:04:43.078925 containerd[1710]: time="2026-01-15T02:04:43.078861066Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 15 02:04:43.079198 kubelet[2935]: E0115 02:04:43.079098 2935 
log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 02:04:43.079812 kubelet[2935]: E0115 02:04:43.079206 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 15 02:04:43.079812 kubelet[2935]: E0115 02:04:43.079430 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ktgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hdqqp_calico-system(328d556f-d445-4769-97a7-1a5530a232c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 15 02:04:43.082923 containerd[1710]: time="2026-01-15T02:04:43.082852184Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 15 02:04:43.435609 containerd[1710]: time="2026-01-15T02:04:43.435216275Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 15 02:04:43.438505 containerd[1710]: time="2026-01-15T02:04:43.438283084Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 15 02:04:43.438505 containerd[1710]: time="2026-01-15T02:04:43.438452104Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 15 02:04:43.439193 kubelet[2935]: E0115 02:04:43.439070 2935 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 02:04:43.439479 kubelet[2935]: E0115 02:04:43.439443 2935 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 15 02:04:43.454786 kubelet[2935]: E0115 02:04:43.454603 2935 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ktgl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-hdqqp_calico-system(328d556f-d445-4769-97a7-1a5530a232c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 15 02:04:43.456481 kubelet[2935]: E0115 02:04:43.456403 2935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with 
ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-hdqqp" podUID="328d556f-d445-4769-97a7-1a5530a232c4"