Jan 23 17:25:33.410580 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Jan 23 17:25:33.410603 kernel: Linux version 6.12.66-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Fri Jan 23 15:38:20 -00 2026
Jan 23 17:25:33.410613 kernel: KASLR enabled
Jan 23 17:25:33.410619 kernel: efi: EFI v2.7 by EDK II
Jan 23 17:25:33.410625 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438351218
Jan 23 17:25:33.410631 kernel: random: crng init done
Jan 23 17:25:33.410638 kernel: secureboot: Secure boot disabled
Jan 23 17:25:33.410644 kernel: ACPI: Early table checksum verification disabled
Jan 23 17:25:33.410650 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Jan 23 17:25:33.410657 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Jan 23 17:25:33.410663 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 17:25:33.410670 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 17:25:33.410676 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 17:25:33.410682 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 17:25:33.410691 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 17:25:33.410697 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 17:25:33.410704 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 17:25:33.410710 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 17:25:33.410716 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 17:25:33.410723 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 23 17:25:33.410729 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Jan 23 17:25:33.410736 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Jan 23 17:25:33.410742 kernel: ACPI: Use ACPI SPCR as default console: Yes
Jan 23 17:25:33.410750 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Jan 23 17:25:33.410757 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Jan 23 17:25:33.410763 kernel: Zone ranges:
Jan 23 17:25:33.410769 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Jan 23 17:25:33.410775 kernel: DMA32 empty
Jan 23 17:25:33.410782 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Jan 23 17:25:33.410788 kernel: Device empty
Jan 23 17:25:33.410794 kernel: Movable zone start for each node
Jan 23 17:25:33.410801 kernel: Early memory node ranges
Jan 23 17:25:33.410807 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Jan 23 17:25:33.410813 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Jan 23 17:25:33.410820 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Jan 23 17:25:33.410827 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Jan 23 17:25:33.410834 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Jan 23 17:25:33.410840 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Jan 23 17:25:33.410846 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Jan 23 17:25:33.410853 kernel: psci: probing for conduit method from ACPI.
Jan 23 17:25:33.410862 kernel: psci: PSCIv1.3 detected in firmware.
Jan 23 17:25:33.410871 kernel: psci: Using standard PSCI v0.2 function IDs
Jan 23 17:25:33.410878 kernel: psci: Trusted OS migration not required
Jan 23 17:25:33.410884 kernel: psci: SMC Calling Convention v1.1
Jan 23 17:25:33.410891 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Jan 23 17:25:33.410898 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Jan 23 17:25:33.410905 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Jan 23 17:25:33.410911 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Jan 23 17:25:33.410918 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Jan 23 17:25:33.410926 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Jan 23 17:25:33.410933 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Jan 23 17:25:33.410940 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Jan 23 17:25:33.410947 kernel: Detected PIPT I-cache on CPU0
Jan 23 17:25:33.410954 kernel: CPU features: detected: GIC system register CPU interface
Jan 23 17:25:33.410961 kernel: CPU features: detected: Spectre-v4
Jan 23 17:25:33.410968 kernel: CPU features: detected: Spectre-BHB
Jan 23 17:25:33.410975 kernel: CPU features: kernel page table isolation forced ON by KASLR
Jan 23 17:25:33.410982 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Jan 23 17:25:33.410988 kernel: CPU features: detected: ARM erratum 1418040
Jan 23 17:25:33.410995 kernel: CPU features: detected: SSBS not fully self-synchronizing
Jan 23 17:25:33.411003 kernel: alternatives: applying boot alternatives
Jan 23 17:25:33.411011 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=35f959b0e84cd72dec35dcaa9fdae098b059a7436b8ff34bc604c87ac6375079
Jan 23 17:25:33.411018 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Jan 23 17:25:33.411025 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 23 17:25:33.411032 kernel: Fallback order for Node 0: 0
Jan 23 17:25:33.411039 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Jan 23 17:25:33.411046 kernel: Policy zone: Normal
Jan 23 17:25:33.411052 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 23 17:25:33.411067 kernel: software IO TLB: area num 4.
Jan 23 17:25:33.411075 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Jan 23 17:25:33.411084 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Jan 23 17:25:33.411091 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 23 17:25:33.411098 kernel: rcu: RCU event tracing is enabled.
Jan 23 17:25:33.411105 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Jan 23 17:25:33.411112 kernel: Trampoline variant of Tasks RCU enabled.
Jan 23 17:25:33.411119 kernel: Tracing variant of Tasks RCU enabled.
Jan 23 17:25:33.411126 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 23 17:25:33.411133 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Jan 23 17:25:33.411140 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 23 17:25:33.411147 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 23 17:25:33.411154 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Jan 23 17:25:33.411162 kernel: GICv3: 256 SPIs implemented
Jan 23 17:25:33.411169 kernel: GICv3: 0 Extended SPIs implemented
Jan 23 17:25:33.411175 kernel: Root IRQ handler: gic_handle_irq
Jan 23 17:25:33.411182 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Jan 23 17:25:33.411189 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Jan 23 17:25:33.411196 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Jan 23 17:25:33.411203 kernel: ITS [mem 0x08080000-0x0809ffff]
Jan 23 17:25:33.411209 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Jan 23 17:25:33.411217 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Jan 23 17:25:33.411223 kernel: GICv3: using LPI property table @0x0000000100130000
Jan 23 17:25:33.411230 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Jan 23 17:25:33.411237 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 23 17:25:33.411245 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 23 17:25:33.411252 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Jan 23 17:25:33.411259 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Jan 23 17:25:33.411266 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Jan 23 17:25:33.411273 kernel: arm-pv: using stolen time PV
Jan 23 17:25:33.411286 kernel: Console: colour dummy device 80x25
Jan 23 17:25:33.411293 kernel: ACPI: Core revision 20240827
Jan 23 17:25:33.411301 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Jan 23 17:25:33.411325 kernel: pid_max: default: 32768 minimum: 301
Jan 23 17:25:33.411333 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jan 23 17:25:33.411340 kernel: landlock: Up and running.
Jan 23 17:25:33.411347 kernel: SELinux: Initializing.
Jan 23 17:25:33.411354 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 23 17:25:33.411361 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 23 17:25:33.411368 kernel: rcu: Hierarchical SRCU implementation.
Jan 23 17:25:33.411376 kernel: rcu: Max phase no-delay instances is 400.
Jan 23 17:25:33.411384 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jan 23 17:25:33.411392 kernel: Remapping and enabling EFI services.
Jan 23 17:25:33.411399 kernel: smp: Bringing up secondary CPUs ...
Jan 23 17:25:33.411406 kernel: Detected PIPT I-cache on CPU1
Jan 23 17:25:33.411413 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Jan 23 17:25:33.411421 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Jan 23 17:25:33.411428 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 23 17:25:33.411437 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Jan 23 17:25:33.411444 kernel: Detected PIPT I-cache on CPU2
Jan 23 17:25:33.411456 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Jan 23 17:25:33.411465 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Jan 23 17:25:33.411473 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 23 17:25:33.411480 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Jan 23 17:25:33.411487 kernel: Detected PIPT I-cache on CPU3
Jan 23 17:25:33.411495 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Jan 23 17:25:33.411504 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Jan 23 17:25:33.411512 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 23 17:25:33.411519 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Jan 23 17:25:33.411527 kernel: smp: Brought up 1 node, 4 CPUs
Jan 23 17:25:33.411534 kernel: SMP: Total of 4 processors activated.
Jan 23 17:25:33.411542 kernel: CPU: All CPU(s) started at EL1
Jan 23 17:25:33.411550 kernel: CPU features: detected: 32-bit EL0 Support
Jan 23 17:25:33.411558 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Jan 23 17:25:33.411566 kernel: CPU features: detected: Common not Private translations
Jan 23 17:25:33.411573 kernel: CPU features: detected: CRC32 instructions
Jan 23 17:25:33.411581 kernel: CPU features: detected: Enhanced Virtualization Traps
Jan 23 17:25:33.411588 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Jan 23 17:25:33.411596 kernel: CPU features: detected: LSE atomic instructions
Jan 23 17:25:33.411603 kernel: CPU features: detected: Privileged Access Never
Jan 23 17:25:33.411612 kernel: CPU features: detected: RAS Extension Support
Jan 23 17:25:33.411624 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Jan 23 17:25:33.411632 kernel: alternatives: applying system-wide alternatives
Jan 23 17:25:33.411640 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Jan 23 17:25:33.411648 kernel: Memory: 16324432K/16777216K available (11200K kernel code, 2458K rwdata, 9088K rodata, 12480K init, 1038K bss, 430000K reserved, 16384K cma-reserved)
Jan 23 17:25:33.411656 kernel: devtmpfs: initialized
Jan 23 17:25:33.411663 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 23 17:25:33.411673 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Jan 23 17:25:33.411680 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Jan 23 17:25:33.411688 kernel: 0 pages in range for non-PLT usage
Jan 23 17:25:33.411695 kernel: 515168 pages in range for PLT usage
Jan 23 17:25:33.411703 kernel: pinctrl core: initialized pinctrl subsystem
Jan 23 17:25:33.411710 kernel: SMBIOS 3.0.0 present.
Jan 23 17:25:33.411718 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Jan 23 17:25:33.411726 kernel: DMI: Memory slots populated: 1/1
Jan 23 17:25:33.411734 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 23 17:25:33.411741 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Jan 23 17:25:33.411749 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 23 17:25:33.411757 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 23 17:25:33.411765 kernel: audit: initializing netlink subsys (disabled)
Jan 23 17:25:33.411772 kernel: audit: type=2000 audit(0.037:1): state=initialized audit_enabled=0 res=1
Jan 23 17:25:33.411781 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 23 17:25:33.411788 kernel: cpuidle: using governor menu
Jan 23 17:25:33.411796 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Jan 23 17:25:33.411803 kernel: ASID allocator initialised with 32768 entries
Jan 23 17:25:33.411811 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 23 17:25:33.411819 kernel: Serial: AMBA PL011 UART driver
Jan 23 17:25:33.411826 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 23 17:25:33.411834 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Jan 23 17:25:33.411843 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Jan 23 17:25:33.411850 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Jan 23 17:25:33.411857 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 23 17:25:33.411865 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Jan 23 17:25:33.411872 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Jan 23 17:25:33.411879 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Jan 23 17:25:33.411889 kernel: ACPI: Added _OSI(Module Device)
Jan 23 17:25:33.411898 kernel: ACPI: Added _OSI(Processor Device)
Jan 23 17:25:33.411906 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 23 17:25:33.411913 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 23 17:25:33.411921 kernel: ACPI: Interpreter enabled
Jan 23 17:25:33.411929 kernel: ACPI: Using GIC for interrupt routing
Jan 23 17:25:33.411936 kernel: ACPI: MCFG table detected, 1 entries
Jan 23 17:25:33.411943 kernel: ACPI: CPU0 has been hot-added
Jan 23 17:25:33.411952 kernel: ACPI: CPU1 has been hot-added
Jan 23 17:25:33.411960 kernel: ACPI: CPU2 has been hot-added
Jan 23 17:25:33.411967 kernel: ACPI: CPU3 has been hot-added
Jan 23 17:25:33.411975 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Jan 23 17:25:33.411982 kernel: printk: legacy console [ttyAMA0] enabled
Jan 23 17:25:33.411990 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 23 17:25:33.412145 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 23 17:25:33.412234 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jan 23 17:25:33.412340 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jan 23 17:25:33.412434 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Jan 23 17:25:33.412515 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Jan 23 17:25:33.412525 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Jan 23 17:25:33.412533 kernel: PCI host bridge to bus 0000:00
Jan 23 17:25:33.412623 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Jan 23 17:25:33.412707 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Jan 23 17:25:33.412780 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Jan 23 17:25:33.412851 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 23 17:25:33.412949 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Jan 23 17:25:33.413041 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.413125 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Jan 23 17:25:33.413205 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jan 23 17:25:33.413284 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Jan 23 17:25:33.413388 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Jan 23 17:25:33.413476 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.413559 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Jan 23 17:25:33.413638 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Jan 23 17:25:33.413717 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Jan 23 17:25:33.413803 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.413881 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Jan 23 17:25:33.413961 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Jan 23 17:25:33.414041 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Jan 23 17:25:33.414119 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Jan 23 17:25:33.414205 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.414299 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Jan 23 17:25:33.414424 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Jan 23 17:25:33.414516 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Jan 23 17:25:33.414608 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.414698 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Jan 23 17:25:33.414780 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Jan 23 17:25:33.414859 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Jan 23 17:25:33.414949 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Jan 23 17:25:33.415041 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.415122 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Jan 23 17:25:33.415201 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Jan 23 17:25:33.415279 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Jan 23 17:25:33.415385 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Jan 23 17:25:33.415476 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.415574 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Jan 23 17:25:33.415659 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Jan 23 17:25:33.415756 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.415859 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Jan 23 17:25:33.415941 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Jan 23 17:25:33.416032 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.416115 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Jan 23 17:25:33.416201 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Jan 23 17:25:33.416288 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.416388 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Jan 23 17:25:33.416476 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Jan 23 17:25:33.416572 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.416663 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Jan 23 17:25:33.416758 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Jan 23 17:25:33.416852 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.416934 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Jan 23 17:25:33.417017 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Jan 23 17:25:33.417105 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.417186 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Jan 23 17:25:33.417273 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Jan 23 17:25:33.417385 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.417466 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Jan 23 17:25:33.417554 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Jan 23 17:25:33.417652 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.417734 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Jan 23 17:25:33.417812 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Jan 23 17:25:33.417897 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.417980 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Jan 23 17:25:33.418064 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Jan 23 17:25:33.418152 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.418237 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Jan 23 17:25:33.418362 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Jan 23 17:25:33.418457 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.418542 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Jan 23 17:25:33.418620 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Jan 23 17:25:33.418699 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Jan 23 17:25:33.418778 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Jan 23 17:25:33.418863 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.418944 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Jan 23 17:25:33.419027 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Jan 23 17:25:33.419116 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Jan 23 17:25:33.419197 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Jan 23 17:25:33.419282 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.419388 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Jan 23 17:25:33.419470 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Jan 23 17:25:33.419551 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Jan 23 17:25:33.419632 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Jan 23 17:25:33.419726 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.419813 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Jan 23 17:25:33.419900 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Jan 23 17:25:33.419980 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Jan 23 17:25:33.420069 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Jan 23 17:25:33.420159 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.420239 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Jan 23 17:25:33.420331 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Jan 23 17:25:33.420414 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Jan 23 17:25:33.420500 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Jan 23 17:25:33.420593 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.420675 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Jan 23 17:25:33.420762 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Jan 23 17:25:33.420842 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Jan 23 17:25:33.420920 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Jan 23 17:25:33.421011 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.421099 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Jan 23 17:25:33.421179 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Jan 23 17:25:33.421268 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Jan 23 17:25:33.421366 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Jan 23 17:25:33.421454 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.421535 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Jan 23 17:25:33.421625 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Jan 23 17:25:33.421707 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Jan 23 17:25:33.421788 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Jan 23 17:25:33.421875 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.421955 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Jan 23 17:25:33.422040 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Jan 23 17:25:33.422140 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Jan 23 17:25:33.422234 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Jan 23 17:25:33.422355 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.422440 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Jan 23 17:25:33.422523 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Jan 23 17:25:33.422602 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Jan 23 17:25:33.422684 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Jan 23 17:25:33.422768 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.422848 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Jan 23 17:25:33.422928 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Jan 23 17:25:33.423010 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Jan 23 17:25:33.423089 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Jan 23 17:25:33.423176 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.423256 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Jan 23 17:25:33.423370 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Jan 23 17:25:33.423458 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Jan 23 17:25:33.423538 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Jan 23 17:25:33.423626 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.423705 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Jan 23 17:25:33.423784 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Jan 23 17:25:33.423863 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Jan 23 17:25:33.423945 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Jan 23 17:25:33.424031 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.424111 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Jan 23 17:25:33.424192 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Jan 23 17:25:33.424272 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Jan 23 17:25:33.424366 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Jan 23 17:25:33.424477 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.424578 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Jan 23 17:25:33.424671 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Jan 23 17:25:33.424751 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Jan 23 17:25:33.424829 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Jan 23 17:25:33.424919 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 23 17:25:33.425001 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Jan 23 17:25:33.425079 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Jan 23 17:25:33.425157 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Jan 23 17:25:33.425235 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Jan 23 17:25:33.425341 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 23 17:25:33.425436 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Jan 23 17:25:33.425540 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Jan 23 17:25:33.425633 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jan 23 17:25:33.425736 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Jan 23 17:25:33.425820 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Jan 23 17:25:33.425917 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Jan 23 17:25:33.426002 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Jan 23 17:25:33.426082 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Jan 23 17:25:33.426170 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jan 23 17:25:33.426251 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Jan 23 17:25:33.426376 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jan 23 17:25:33.426483 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Jan 23 17:25:33.426573 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Jan 23 17:25:33.426681 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Jan 23 17:25:33.426767 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Jan 23 17:25:33.426860 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Jan 23 17:25:33.426943 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Jan 23 17:25:33.427031 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Jan 23 17:25:33.427117 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Jan 23 17:25:33.427203 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Jan 23 17:25:33.427283 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Jan 23 17:25:33.427398 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Jan 23 17:25:33.427483 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jan 23 17:25:33.427565 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Jan 23 17:25:33.427647 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Jan 23 17:25:33.427730 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jan 23 17:25:33.427809 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Jan 23 17:25:33.427891 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Jan 23 17:25:33.427974 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jan 23 17:25:33.428053 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Jan 23 17:25:33.428133 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Jan 23 17:25:33.428216 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 23 17:25:33.428297 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Jan 23 17:25:33.428396 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Jan 23 17:25:33.428479 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 23 17:25:33.428570 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Jan 23 17:25:33.428674 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Jan 23 17:25:33.428792 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 23 17:25:33.428878 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Jan 23 17:25:33.428957 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Jan 23 17:25:33.429039 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 23 17:25:33.429118 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Jan 23 17:25:33.429197 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Jan 23 17:25:33.429287 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 23 17:25:33.429391 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Jan 23 17:25:33.429473 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Jan 23 17:25:33.429557 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Jan 23 17:25:33.429636 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Jan 23 17:25:33.429715 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000
Jan 23 17:25:33.429801 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Jan 23 17:25:33.429880 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000
Jan 23 17:25:33.429958 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000
Jan 23 17:25:33.430041 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Jan 23 17:25:33.430120 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000
Jan 23 17:25:33.430198 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000
Jan 23
17:25:33.430296 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 23 17:25:33.430445 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Jan 23 17:25:33.430528 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Jan 23 17:25:33.430622 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 23 17:25:33.430707 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Jan 23 17:25:33.430787 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Jan 23 17:25:33.430875 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 23 17:25:33.430961 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Jan 23 17:25:33.431041 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Jan 23 17:25:33.431132 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 23 17:25:33.431213 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Jan 23 17:25:33.431293 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Jan 23 17:25:33.431394 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 23 17:25:33.431476 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Jan 23 17:25:33.431555 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 Jan 23 17:25:33.431646 kernel: pci 0000:00:03.2: 
bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 23 17:25:33.431732 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Jan 23 17:25:33.431817 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Jan 23 17:25:33.431899 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 23 17:25:33.431978 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Jan 23 17:25:33.432061 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Jan 23 17:25:33.432150 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 23 17:25:33.432233 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Jan 23 17:25:33.432333 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Jan 23 17:25:33.432421 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 23 17:25:33.432501 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Jan 23 17:25:33.432580 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Jan 23 17:25:33.432662 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 23 17:25:33.432745 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Jan 23 17:25:33.432824 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Jan 23 17:25:33.432907 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] 
add_size 1000 Jan 23 17:25:33.432987 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Jan 23 17:25:33.433073 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Jan 23 17:25:33.433161 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 23 17:25:33.433241 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Jan 23 17:25:33.433342 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Jan 23 17:25:33.433430 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 23 17:25:33.433509 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Jan 23 17:25:33.433588 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Jan 23 17:25:33.433673 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 23 17:25:33.433752 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Jan 23 17:25:33.433831 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Jan 23 17:25:33.433916 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 23 17:25:33.433996 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Jan 23 17:25:33.434075 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Jan 23 17:25:33.434158 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 23 17:25:33.434245 kernel: 
pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Jan 23 17:25:33.434359 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Jan 23 17:25:33.434449 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 23 17:25:33.434529 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Jan 23 17:25:33.434611 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Jan 23 17:25:33.434694 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 23 17:25:33.434774 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Jan 23 17:25:33.434852 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Jan 23 17:25:33.434934 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 23 17:25:33.435013 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Jan 23 17:25:33.435093 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Jan 23 17:25:33.435175 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 23 17:25:33.435254 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Jan 23 17:25:33.435346 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Jan 23 17:25:33.435430 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Jan 23 17:25:33.435511 kernel: pci 0000:00:01.0: bridge window [mem 
0x8000000000-0x80001fffff 64bit pref]: assigned Jan 23 17:25:33.435595 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Jan 23 17:25:33.435675 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Jan 23 17:25:33.435756 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Jan 23 17:25:33.435839 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Jan 23 17:25:33.435923 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Jan 23 17:25:33.436010 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Jan 23 17:25:33.436102 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Jan 23 17:25:33.436184 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Jan 23 17:25:33.436268 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Jan 23 17:25:33.436362 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Jan 23 17:25:33.436446 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Jan 23 17:25:33.436540 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Jan 23 17:25:33.436626 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Jan 23 17:25:33.436707 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Jan 23 17:25:33.436789 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Jan 23 17:25:33.436869 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Jan 23 17:25:33.436957 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Jan 23 17:25:33.437038 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: 
assigned Jan 23 17:25:33.437126 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Jan 23 17:25:33.437212 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Jan 23 17:25:33.437296 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Jan 23 17:25:33.437390 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Jan 23 17:25:33.437475 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Jan 23 17:25:33.437576 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Jan 23 17:25:33.437659 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Jan 23 17:25:33.437743 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Jan 23 17:25:33.437825 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Jan 23 17:25:33.437904 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Jan 23 17:25:33.437985 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Jan 23 17:25:33.438063 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Jan 23 17:25:33.438143 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Jan 23 17:25:33.438225 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Jan 23 17:25:33.438325 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Jan 23 17:25:33.438412 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Jan 23 17:25:33.438496 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Jan 23 17:25:33.438582 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Jan 23 17:25:33.438665 kernel: pci 
0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Jan 23 17:25:33.438751 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Jan 23 17:25:33.438839 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Jan 23 17:25:33.438922 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Jan 23 17:25:33.439006 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Jan 23 17:25:33.439085 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Jan 23 17:25:33.439166 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Jan 23 17:25:33.439245 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Jan 23 17:25:33.439342 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Jan 23 17:25:33.439425 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Jan 23 17:25:33.439525 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Jan 23 17:25:33.439606 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Jan 23 17:25:33.439698 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Jan 23 17:25:33.439779 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Jan 23 17:25:33.439864 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Jan 23 17:25:33.439946 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Jan 23 17:25:33.440029 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Jan 23 17:25:33.440111 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Jan 23 17:25:33.440194 kernel: pci 0000:00:04.4: bridge window [mem 
0x13800000-0x139fffff]: assigned Jan 23 17:25:33.440278 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Jan 23 17:25:33.440371 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Jan 23 17:25:33.440454 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Jan 23 17:25:33.440535 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Jan 23 17:25:33.440614 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Jan 23 17:25:33.440694 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Jan 23 17:25:33.440777 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Jan 23 17:25:33.440860 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Jan 23 17:25:33.440949 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Jan 23 17:25:33.441032 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Jan 23 17:25:33.441110 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Jan 23 17:25:33.441190 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Jan 23 17:25:33.441268 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Jan 23 17:25:33.441366 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Jan 23 17:25:33.441451 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Jan 23 17:25:33.441532 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Jan 23 17:25:33.441611 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Jan 23 17:25:33.441690 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Jan 23 17:25:33.441769 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Jan 23 17:25:33.441849 kernel: pci 0000:00:01.5: BAR 0 
[mem 0x14205000-0x14205fff]: assigned Jan 23 17:25:33.441928 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Jan 23 17:25:33.442018 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned Jan 23 17:25:33.442102 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Jan 23 17:25:33.442182 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Jan 23 17:25:33.442263 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Jan 23 17:25:33.442374 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Jan 23 17:25:33.442457 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Jan 23 17:25:33.442542 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Jan 23 17:25:33.442622 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Jan 23 17:25:33.442704 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Jan 23 17:25:33.442783 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Jan 23 17:25:33.442865 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Jan 23 17:25:33.442945 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Jan 23 17:25:33.443026 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Jan 23 17:25:33.443115 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Jan 23 17:25:33.443198 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Jan 23 17:25:33.443278 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Jan 23 17:25:33.443372 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Jan 23 17:25:33.443457 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Jan 23 17:25:33.443539 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Jan 23 17:25:33.443623 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 
17:25:33.443711 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.443794 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Jan 23 17:25:33.443880 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.443963 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.444044 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Jan 23 17:25:33.444124 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.444207 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.444290 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Jan 23 17:25:33.444386 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.444469 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.444555 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Jan 23 17:25:33.444638 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.444719 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.444817 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Jan 23 17:25:33.444898 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.444979 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.445061 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Jan 23 17:25:33.445150 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.445232 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.445324 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Jan 23 17:25:33.445407 kernel: pci 
0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.445494 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.445585 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Jan 23 17:25:33.445672 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.445760 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.445853 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Jan 23 17:25:33.445937 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.446016 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.446109 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Jan 23 17:25:33.446195 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.446286 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.446411 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Jan 23 17:25:33.446501 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.446581 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.446671 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Jan 23 17:25:33.446765 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.446847 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.446932 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Jan 23 17:25:33.447011 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.447089 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.447170 kernel: pci 0000:00:04.5: BAR 0 [mem 
0x1421d000-0x1421dfff]: assigned Jan 23 17:25:33.447251 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.447352 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.447439 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Jan 23 17:25:33.447520 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.447604 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.447695 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Jan 23 17:25:33.447790 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.447882 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.447970 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Jan 23 17:25:33.448053 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.448142 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.448223 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Jan 23 17:25:33.448635 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Jan 23 17:25:33.448766 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Jan 23 17:25:33.448874 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Jan 23 17:25:33.448960 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Jan 23 17:25:33.449043 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Jan 23 17:25:33.449128 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Jan 23 17:25:33.449211 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Jan 23 17:25:33.449293 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Jan 23 17:25:33.449970 kernel: pci 0000:00:03.7: 
bridge window [io 0xa000-0xafff]: assigned Jan 23 17:25:33.450062 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Jan 23 17:25:33.450144 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Jan 23 17:25:33.450226 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Jan 23 17:25:33.450348 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Jan 23 17:25:33.450436 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Jan 23 17:25:33.450520 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.450599 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.450698 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.450779 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.450862 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.450942 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.451024 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.451118 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.451203 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.451286 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.451385 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.451467 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.451547 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.451626 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.451708 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: 
can't assign; no space Jan 23 17:25:33.451792 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.451875 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.451957 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.452041 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.452124 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.452206 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.452296 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.452402 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.452484 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.452570 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.452654 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.452736 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.452820 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.452903 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.453003 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.453087 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.453177 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.453266 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.453366 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.453449 kernel: pci 0000:00:01.0: bridge 
window [io size 0x1000]: can't assign; no space Jan 23 17:25:33.453537 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Jan 23 17:25:33.453625 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Jan 23 17:25:33.453707 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Jan 23 17:25:33.453798 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Jan 23 17:25:33.453879 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 23 17:25:33.453960 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Jan 23 17:25:33.454040 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Jan 23 17:25:33.454128 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Jan 23 17:25:33.454207 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Jan 23 17:25:33.454313 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Jan 23 17:25:33.454405 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Jan 23 17:25:33.454495 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Jan 23 17:25:33.454586 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Jan 23 17:25:33.454667 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Jan 23 17:25:33.454746 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Jan 23 17:25:33.454829 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Jan 23 17:25:33.454918 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Jan 23 17:25:33.454999 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Jan 23 17:25:33.455092 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Jan 23 17:25:33.455172 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Jan 23 17:25:33.455258 kernel: pci 0000:05:00.0: BAR 4 [mem 
0x8000800000-0x8000803fff 64bit pref]: assigned Jan 23 17:25:33.455364 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Jan 23 17:25:33.455450 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Jan 23 17:25:33.455540 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Jan 23 17:25:33.455621 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Jan 23 17:25:33.455712 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Jan 23 17:25:33.455800 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Jan 23 17:25:33.455884 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 23 17:25:33.455963 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Jan 23 17:25:33.456041 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 23 17:25:33.456131 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 23 17:25:33.456213 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Jan 23 17:25:33.456292 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 23 17:25:33.456389 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 23 17:25:33.456477 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Jan 23 17:25:33.456568 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 23 17:25:33.456651 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 23 17:25:33.456729 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Jan 23 17:25:33.456808 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jan 23 17:25:33.456897 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 23 17:25:33.456984 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Jan 23 17:25:33.457064 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Jan 23 17:25:33.457144 kernel: pci 
0000:00:02.2: PCI bridge to [bus 0b] Jan 23 17:25:33.457228 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Jan 23 17:25:33.457321 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Jan 23 17:25:33.457404 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 23 17:25:33.457486 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Jan 23 17:25:33.457568 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Jan 23 17:25:33.457648 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 23 17:25:33.457729 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Jan 23 17:25:33.457812 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Jan 23 17:25:33.457892 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 23 17:25:33.457980 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Jan 23 17:25:33.458060 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 23 17:25:33.458140 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 23 17:25:33.458222 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Jan 23 17:25:33.458331 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 23 17:25:33.458419 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 23 17:25:33.458506 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Jan 23 17:25:33.458597 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 23 17:25:33.458683 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 23 17:25:33.458763 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Jan 23 17:25:33.458841 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Jan 23 17:25:33.458921 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 23 17:25:33.459000 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Jan 23 17:25:33.459085 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Jan 23 17:25:33.459168 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 23 17:25:33.459246 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Jan 23 17:25:33.459344 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Jan 23 17:25:33.459426 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Jan 23 17:25:33.459505 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 23 17:25:33.459584 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Jan 23 17:25:33.459662 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Jan 23 17:25:33.459749 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Jan 23 17:25:33.459831 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 23 17:25:33.459910 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Jan 23 17:25:33.459989 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Jan 23 17:25:33.460067 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Jan 23 17:25:33.460147 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 23 17:25:33.460227 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Jan 23 17:25:33.460316 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Jan 23 17:25:33.460400 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 23 17:25:33.460481 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 23 17:25:33.460560 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Jan 23 17:25:33.460638 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Jan 23 17:25:33.460716 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 23 17:25:33.460798 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 23 17:25:33.460877 kernel: pci 
0000:00:03.7: bridge window [io 0xa000-0xafff] Jan 23 17:25:33.460955 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Jan 23 17:25:33.461034 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 23 17:25:33.461113 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 23 17:25:33.461192 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Jan 23 17:25:33.461270 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Jan 23 17:25:33.461362 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Jan 23 17:25:33.461445 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 23 17:25:33.461524 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Jan 23 17:25:33.461602 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Jan 23 17:25:33.461681 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Jan 23 17:25:33.461761 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 23 17:25:33.461926 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Jan 23 17:25:33.462012 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Jan 23 17:25:33.462091 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Jan 23 17:25:33.462172 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 23 17:25:33.462250 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Jan 23 17:25:33.462386 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Jan 23 17:25:33.462470 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Jan 23 17:25:33.462570 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 23 17:25:33.462651 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Jan 23 17:25:33.462731 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Jan 23 17:25:33.462809 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] 
Jan 23 17:25:33.462890 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 23 17:25:33.462970 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Jan 23 17:25:33.463049 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Jan 23 17:25:33.463131 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 23 17:25:33.463211 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 23 17:25:33.463294 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Jan 23 17:25:33.463391 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Jan 23 17:25:33.463472 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 23 17:25:33.463554 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 23 17:25:33.463636 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Jan 23 17:25:33.463715 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Jan 23 17:25:33.463794 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 23 17:25:33.463876 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 23 17:25:33.463956 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Jan 23 17:25:33.464036 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Jan 23 17:25:33.464116 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Jan 23 17:25:33.464203 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jan 23 17:25:33.464276 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Jan 23 17:25:33.464362 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jan 23 17:25:33.464448 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jan 23 17:25:33.464524 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jan 23 17:25:33.464614 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jan 23 17:25:33.464693 kernel: pci_bus 
0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jan 23 17:25:33.464839 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jan 23 17:25:33.464922 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jan 23 17:25:33.465008 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jan 23 17:25:33.465085 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jan 23 17:25:33.465171 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jan 23 17:25:33.465246 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jan 23 17:25:33.465350 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jan 23 17:25:33.465432 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 23 17:25:33.465518 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jan 23 17:25:33.465600 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 23 17:25:33.465696 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jan 23 17:25:33.465793 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 23 17:25:33.465876 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jan 23 17:25:33.465971 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jan 23 17:25:33.466060 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Jan 23 17:25:33.466135 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Jan 23 17:25:33.466222 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Jan 23 17:25:33.466343 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Jan 23 17:25:33.466435 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Jan 23 17:25:33.466516 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Jan 23 17:25:33.466609 kernel: pci_bus 
0000:0d: resource 1 [mem 0x11800000-0x119fffff] Jan 23 17:25:33.466686 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Jan 23 17:25:33.466767 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Jan 23 17:25:33.466842 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 23 17:25:33.466933 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Jan 23 17:25:33.467021 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 23 17:25:33.467119 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Jan 23 17:25:33.467195 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 23 17:25:33.467277 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Jan 23 17:25:33.467373 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Jan 23 17:25:33.467486 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Jan 23 17:25:33.467572 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Jan 23 17:25:33.467684 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Jan 23 17:25:33.467769 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Jan 23 17:25:33.467852 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Jan 23 17:25:33.467940 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Jan 23 17:25:33.468015 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Jan 23 17:25:33.468095 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Jan 23 17:25:33.468179 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Jan 23 17:25:33.468253 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Jan 23 17:25:33.468357 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Jan 23 17:25:33.468443 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Jan 23 17:25:33.468517 
kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Jan 23 17:25:33.468591 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 23 17:25:33.468681 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Jan 23 17:25:33.468757 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Jan 23 17:25:33.468834 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 23 17:25:33.468914 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Jan 23 17:25:33.468989 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Jan 23 17:25:33.469074 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 23 17:25:33.469163 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Jan 23 17:25:33.469242 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Jan 23 17:25:33.469353 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Jan 23 17:25:33.469440 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Jan 23 17:25:33.469516 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Jan 23 17:25:33.469591 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Jan 23 17:25:33.469673 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 23 17:25:33.469750 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Jan 23 17:25:33.469833 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Jan 23 17:25:33.469932 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Jan 23 17:25:33.470018 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Jan 23 17:25:33.470107 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Jan 23 17:25:33.470192 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Jan 23 17:25:33.470322 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Jan 23 17:25:33.470408 kernel: pci_bus 0000:1d: resource 2 [mem 
0x8003800000-0x80039fffff 64bit pref] Jan 23 17:25:33.470493 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Jan 23 17:25:33.470569 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Jan 23 17:25:33.470643 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 23 17:25:33.470726 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Jan 23 17:25:33.470800 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Jan 23 17:25:33.470873 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 23 17:25:33.470954 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Jan 23 17:25:33.471028 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Jan 23 17:25:33.471102 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 23 17:25:33.471188 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Jan 23 17:25:33.471264 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Jan 23 17:25:33.471354 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Jan 23 17:25:33.471365 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jan 23 17:25:33.471373 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jan 23 17:25:33.471381 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jan 23 17:25:33.471392 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jan 23 17:25:33.471401 kernel: iommu: Default domain type: Translated Jan 23 17:25:33.471409 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jan 23 17:25:33.471417 kernel: efivars: Registered efivars operations Jan 23 17:25:33.471425 kernel: vgaarb: loaded Jan 23 17:25:33.471433 kernel: clocksource: Switched to clocksource arch_sys_counter Jan 23 17:25:33.471441 kernel: VFS: Disk quotas dquot_6.6.0 Jan 23 17:25:33.471451 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 23 17:25:33.471459 kernel: pnp: PnP ACPI 
init Jan 23 17:25:33.471552 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jan 23 17:25:33.471564 kernel: pnp: PnP ACPI: found 1 devices Jan 23 17:25:33.471572 kernel: NET: Registered PF_INET protocol family Jan 23 17:25:33.471580 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 23 17:25:33.471588 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Jan 23 17:25:33.471599 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 23 17:25:33.471607 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 23 17:25:33.471615 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Jan 23 17:25:33.471623 kernel: TCP: Hash tables configured (established 131072 bind 65536) Jan 23 17:25:33.471631 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 23 17:25:33.471640 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 23 17:25:33.471648 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 23 17:25:33.471739 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jan 23 17:25:33.471751 kernel: PCI: CLS 0 bytes, default 64 Jan 23 17:25:33.471760 kernel: kvm [1]: HYP mode not available Jan 23 17:25:33.471768 kernel: Initialise system trusted keyrings Jan 23 17:25:33.471776 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Jan 23 17:25:33.471784 kernel: Key type asymmetric registered Jan 23 17:25:33.471792 kernel: Asymmetric key parser 'x509' registered Jan 23 17:25:33.471802 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jan 23 17:25:33.471810 kernel: io scheduler mq-deadline registered Jan 23 17:25:33.471818 kernel: io scheduler kyber registered Jan 23 17:25:33.471826 kernel: io scheduler bfq registered Jan 23 17:25:33.471836 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jan 23 
17:25:33.471920 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Jan 23 17:25:33.472000 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Jan 23 17:25:33.472090 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.472176 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Jan 23 17:25:33.472260 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Jan 23 17:25:33.472364 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.472448 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Jan 23 17:25:33.472529 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Jan 23 17:25:33.472612 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.472695 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Jan 23 17:25:33.472774 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Jan 23 17:25:33.472852 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.472934 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Jan 23 17:25:33.473031 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Jan 23 17:25:33.473109 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.473194 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Jan 23 17:25:33.473277 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55 Jan 23 17:25:33.473368 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.473452 
kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Jan 23 17:25:33.473532 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Jan 23 17:25:33.473610 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.473695 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Jan 23 17:25:33.473774 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Jan 23 17:25:33.473851 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.473863 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jan 23 17:25:33.473942 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Jan 23 17:25:33.474021 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Jan 23 17:25:33.474101 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.474182 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Jan 23 17:25:33.474261 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Jan 23 17:25:33.474381 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.474468 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Jan 23 17:25:33.474549 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Jan 23 17:25:33.474628 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.474715 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Jan 23 17:25:33.474799 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Jan 23 17:25:33.474878 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ 
NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.474960 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Jan 23 17:25:33.475044 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Jan 23 17:25:33.475131 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.475219 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Jan 23 17:25:33.475299 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Jan 23 17:25:33.475393 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.475486 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Jan 23 17:25:33.475570 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Jan 23 17:25:33.475650 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.475737 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Jan 23 17:25:33.475825 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Jan 23 17:25:33.475914 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.475929 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 23 17:25:33.476011 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Jan 23 17:25:33.476092 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Jan 23 17:25:33.476171 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.476256 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Jan 23 17:25:33.476369 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Jan 23 17:25:33.476451 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ 
MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.476542 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Jan 23 17:25:33.476622 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Jan 23 17:25:33.476709 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.476795 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Jan 23 17:25:33.476874 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Jan 23 17:25:33.476953 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.477051 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Jan 23 17:25:33.477142 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Jan 23 17:25:33.477226 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.477326 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Jan 23 17:25:33.477411 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Jan 23 17:25:33.477490 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.477575 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Jan 23 17:25:33.477657 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Jan 23 17:25:33.477737 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.477831 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Jan 23 17:25:33.477927 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Jan 23 17:25:33.478013 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.478024 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jan 23 17:25:33.478105 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Jan 23 17:25:33.478201 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Jan 23 17:25:33.478318 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.478419 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Jan 23 17:25:33.478502 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Jan 23 17:25:33.478583 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.478668 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Jan 23 17:25:33.478760 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Jan 23 17:25:33.478841 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.478929 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Jan 23 17:25:33.479014 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Jan 23 17:25:33.479100 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.479182 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Jan 23 17:25:33.479264 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Jan 23 17:25:33.479361 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.479450 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Jan 23 17:25:33.479531 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Jan 23 17:25:33.479610 kernel: pcieport 
0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.479702 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Jan 23 17:25:33.479786 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Jan 23 17:25:33.479864 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.479946 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Jan 23 17:25:33.480029 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Jan 23 17:25:33.480109 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.480191 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Jan 23 17:25:33.480271 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Jan 23 17:25:33.480367 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 23 17:25:33.480378 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jan 23 17:25:33.480389 kernel: ACPI: button: Power Button [PWRB] Jan 23 17:25:33.480474 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Jan 23 17:25:33.480560 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jan 23 17:25:33.480571 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 23 17:25:33.480580 kernel: thunder_xcv, ver 1.0 Jan 23 17:25:33.480588 kernel: thunder_bgx, ver 1.0 Jan 23 17:25:33.480596 kernel: nicpf, ver 1.0 Jan 23 17:25:33.480606 kernel: nicvf, ver 1.0 Jan 23 17:25:33.480703 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jan 23 17:25:33.480783 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-01-23T17:25:32 UTC (1769189132) Jan 23 17:25:33.480793 kernel: hid: raw HID events driver (C) Jiri 
Kosina Jan 23 17:25:33.480802 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jan 23 17:25:33.480810 kernel: watchdog: NMI not fully supported Jan 23 17:25:33.480821 kernel: watchdog: Hard watchdog permanently disabled Jan 23 17:25:33.480829 kernel: NET: Registered PF_INET6 protocol family Jan 23 17:25:33.480837 kernel: Segment Routing with IPv6 Jan 23 17:25:33.480845 kernel: In-situ OAM (IOAM) with IPv6 Jan 23 17:25:33.480853 kernel: NET: Registered PF_PACKET protocol family Jan 23 17:25:33.480861 kernel: Key type dns_resolver registered Jan 23 17:25:33.480869 kernel: registered taskstats version 1 Jan 23 17:25:33.480878 kernel: Loading compiled-in X.509 certificates Jan 23 17:25:33.480888 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.66-flatcar: 2bef814d3854848add18d21bd2681c3d03c60f56' Jan 23 17:25:33.480896 kernel: Demotion targets for Node 0: null Jan 23 17:25:33.480904 kernel: Key type .fscrypt registered Jan 23 17:25:33.480912 kernel: Key type fscrypt-provisioning registered Jan 23 17:25:33.480920 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 23 17:25:33.480929 kernel: ima: Allocated hash algorithm: sha1 Jan 23 17:25:33.480937 kernel: ima: No architecture policies found Jan 23 17:25:33.480946 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jan 23 17:25:33.480955 kernel: clk: Disabling unused clocks Jan 23 17:25:33.480963 kernel: PM: genpd: Disabling unused power domains Jan 23 17:25:33.480971 kernel: Freeing unused kernel memory: 12480K Jan 23 17:25:33.480979 kernel: Run /init as init process Jan 23 17:25:33.480986 kernel: with arguments: Jan 23 17:25:33.480994 kernel: /init Jan 23 17:25:33.481004 kernel: with environment: Jan 23 17:25:33.481012 kernel: HOME=/ Jan 23 17:25:33.481020 kernel: TERM=linux Jan 23 17:25:33.481027 kernel: ACPI: bus type USB registered Jan 23 17:25:33.481035 kernel: usbcore: registered new interface driver usbfs Jan 23 17:25:33.481044 kernel: usbcore: registered new interface driver hub Jan 23 17:25:33.481052 kernel: usbcore: registered new device driver usb Jan 23 17:25:33.481138 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 23 17:25:33.481224 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 23 17:25:33.481333 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 23 17:25:33.481418 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 23 17:25:33.481501 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 23 17:25:33.481584 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 23 17:25:33.481690 kernel: hub 1-0:1.0: USB hub found Jan 23 17:25:33.481791 kernel: hub 1-0:1.0: 4 ports detected Jan 23 17:25:33.481894 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Jan 23 17:25:33.482003 kernel: hub 2-0:1.0: USB hub found Jan 23 17:25:33.482101 kernel: hub 2-0:1.0: 4 ports detected Jan 23 17:25:33.482200 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Jan 23 17:25:33.482312 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 23 17:25:33.482325 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 23 17:25:33.482334 kernel: GPT:25804799 != 104857599 Jan 23 17:25:33.482343 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 23 17:25:33.482351 kernel: GPT:25804799 != 104857599 Jan 23 17:25:33.482359 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 23 17:25:33.482371 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 23 17:25:33.482379 kernel: SCSI subsystem initialized Jan 23 17:25:33.482388 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 23 17:25:33.482396 kernel: device-mapper: uevent: version 1.0.3 Jan 23 17:25:33.482405 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 23 17:25:33.482413 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jan 23 17:25:33.482422 kernel: raid6: neonx8 gen() 15764 MB/s Jan 23 17:25:33.482432 kernel: raid6: neonx4 gen() 15745 MB/s Jan 23 17:25:33.482440 kernel: raid6: neonx2 gen() 13209 MB/s Jan 23 17:25:33.482448 kernel: raid6: neonx1 gen() 10409 MB/s Jan 23 17:25:33.482456 kernel: raid6: int64x8 gen() 6830 MB/s Jan 23 17:25:33.482465 kernel: raid6: int64x4 gen() 7337 MB/s Jan 23 17:25:33.482473 kernel: raid6: int64x2 gen() 6102 MB/s Jan 23 17:25:33.482481 kernel: raid6: int64x1 gen() 5049 MB/s Jan 23 17:25:33.482491 kernel: raid6: using algorithm neonx8 gen() 15764 MB/s Jan 23 17:25:33.482501 kernel: raid6: .... 
xor() 12082 MB/s, rmw enabled Jan 23 17:25:33.482510 kernel: raid6: using neon recovery algorithm Jan 23 17:25:33.482518 kernel: xor: measuring software checksum speed Jan 23 17:25:33.482528 kernel: 8regs : 21618 MB/sec Jan 23 17:25:33.482537 kernel: 32regs : 21693 MB/sec Jan 23 17:25:33.482545 kernel: arm64_neon : 27984 MB/sec Jan 23 17:25:33.482555 kernel: xor: using function: arm64_neon (27984 MB/sec) Jan 23 17:25:33.482687 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 23 17:25:33.482702 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 23 17:25:33.482711 kernel: BTRFS: device fsid 8d2a73a7-ed2a-4757-891b-9df844aa914e devid 1 transid 35 /dev/mapper/usr (253:0) scanned by mount (276) Jan 23 17:25:33.482720 kernel: BTRFS info (device dm-0): first mount of filesystem 8d2a73a7-ed2a-4757-891b-9df844aa914e Jan 23 17:25:33.482729 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jan 23 17:25:33.482741 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 23 17:25:33.482750 kernel: BTRFS info (device dm-0): enabling free space tree Jan 23 17:25:33.482758 kernel: loop: module loaded Jan 23 17:25:33.482767 kernel: loop0: detected capacity change from 0 to 91840 Jan 23 17:25:33.482775 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 23 17:25:33.482885 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jan 23 17:25:33.482900 systemd[1]: Successfully made /usr/ read-only. Jan 23 17:25:33.482911 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 23 17:25:33.482920 systemd[1]: Detected virtualization kvm. 
Jan 23 17:25:33.482929 systemd[1]: Detected architecture arm64. Jan 23 17:25:33.482937 systemd[1]: Running in initrd. Jan 23 17:25:33.482945 systemd[1]: No hostname configured, using default hostname. Jan 23 17:25:33.482956 systemd[1]: Hostname set to . Jan 23 17:25:33.482965 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 23 17:25:33.482973 systemd[1]: Queued start job for default target initrd.target. Jan 23 17:25:33.482982 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 23 17:25:33.482991 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 23 17:25:33.483000 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 23 17:25:33.483011 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 23 17:25:33.483020 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 23 17:25:33.483029 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 23 17:25:33.483038 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 23 17:25:33.483047 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 23 17:25:33.483056 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 23 17:25:33.483067 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 23 17:25:33.483076 systemd[1]: Reached target paths.target - Path Units. Jan 23 17:25:33.483086 systemd[1]: Reached target slices.target - Slice Units. Jan 23 17:25:33.483095 systemd[1]: Reached target swap.target - Swaps. Jan 23 17:25:33.483104 systemd[1]: Reached target timers.target - Timer Units. Jan 23 17:25:33.483112 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. 
Jan 23 17:25:33.483121 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 23 17:25:33.483132 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 23 17:25:33.483140 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 23 17:25:33.483149 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 23 17:25:33.483158 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 23 17:25:33.483167 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 23 17:25:33.483176 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 23 17:25:33.483185 systemd[1]: Reached target sockets.target - Socket Units. Jan 23 17:25:33.483195 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 23 17:25:33.483204 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 23 17:25:33.483213 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 23 17:25:33.483222 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 23 17:25:33.483231 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 23 17:25:33.483240 systemd[1]: Starting systemd-fsck-usr.service... Jan 23 17:25:33.483250 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 23 17:25:33.483258 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 23 17:25:33.483268 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 17:25:33.483277 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. 
Jan 23 17:25:33.483287 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 23 17:25:33.483296 systemd[1]: Finished systemd-fsck-usr.service. Jan 23 17:25:33.483325 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 23 17:25:33.483359 systemd-journald[419]: Collecting audit messages is enabled. Jan 23 17:25:33.483383 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 17:25:33.483392 kernel: audit: type=1130 audit(1769189133.421:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.483402 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 23 17:25:33.483411 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 23 17:25:33.483419 kernel: Bridge firewalling registered Jan 23 17:25:33.483430 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 23 17:25:33.483439 kernel: audit: type=1130 audit(1769189133.431:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.483448 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 23 17:25:33.483457 kernel: audit: type=1130 audit(1769189133.435:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.483466 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Jan 23 17:25:33.483475 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 23 17:25:33.483484 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 23 17:25:33.483495 kernel: audit: type=1130 audit(1769189133.454:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.483504 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 23 17:25:33.483513 kernel: audit: type=1130 audit(1769189133.459:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.483522 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 23 17:25:33.483531 kernel: audit: type=1130 audit(1769189133.463:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.483540 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 23 17:25:33.483550 kernel: audit: type=1334 audit(1769189133.467:8): prog-id=6 op=LOAD Jan 23 17:25:33.483564 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 23 17:25:33.483574 systemd-journald[419]: Journal started Jan 23 17:25:33.483592 systemd-journald[419]: Runtime Journal (/run/log/journal/99e74ae68da347feb6c5362e2b6ec98a) is 8M, max 319.5M, 311.5M free. Jan 23 17:25:33.421000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:25:33.431000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.435000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.454000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.459000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.463000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.467000 audit: BPF prog-id=6 op=LOAD Jan 23 17:25:33.429210 systemd-modules-load[420]: Inserted module 'br_netfilter' Jan 23 17:25:33.486023 systemd[1]: Started systemd-journald.service - Journal Service. Jan 23 17:25:33.485000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.488333 kernel: audit: type=1130 audit(1769189133.485:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.489880 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Jan 23 17:25:33.493298 dracut-cmdline[447]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=35f959b0e84cd72dec35dcaa9fdae098b059a7436b8ff34bc604c87ac6375079 Jan 23 17:25:33.512677 systemd-tmpfiles[472]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 23 17:25:33.516797 systemd-resolved[448]: Positive Trust Anchors: Jan 23 17:25:33.516816 systemd-resolved[448]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 23 17:25:33.516819 systemd-resolved[448]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 23 17:25:33.523814 kernel: audit: type=1130 audit(1769189133.520:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.520000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:25:33.516849 systemd-resolved[448]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 23 17:25:33.518783 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 23 17:25:33.544770 systemd-resolved[448]: Defaulting to hostname 'linux'. Jan 23 17:25:33.545756 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 23 17:25:33.546000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.546775 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 23 17:25:33.582369 kernel: Loading iSCSI transport class v2.0-870. Jan 23 17:25:33.592322 kernel: iscsi: registered transport (tcp) Jan 23 17:25:33.606346 kernel: iscsi: registered transport (qla4xxx) Jan 23 17:25:33.606407 kernel: QLogic iSCSI HBA Driver Jan 23 17:25:33.627679 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 23 17:25:33.662123 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 23 17:25:33.663000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:25:33.664247 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 23 17:25:33.707824 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 23 17:25:33.708000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.710134 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 23 17:25:33.711718 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 23 17:25:33.749070 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 23 17:25:33.749000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.750000 audit: BPF prog-id=7 op=LOAD Jan 23 17:25:33.750000 audit: BPF prog-id=8 op=LOAD Jan 23 17:25:33.751744 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 23 17:25:33.785135 systemd-udevd[701]: Using default interface naming scheme 'v257'. Jan 23 17:25:33.793147 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 23 17:25:33.794000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.795804 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 23 17:25:33.815741 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 23 17:25:33.816000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 23 17:25:33.817000 audit: BPF prog-id=9 op=LOAD Jan 23 17:25:33.818265 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 23 17:25:33.822099 dracut-pre-trigger[776]: rd.md=0: removing MD RAID activation Jan 23 17:25:33.846076 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 23 17:25:33.846000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.848060 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 23 17:25:33.861972 systemd-networkd[809]: lo: Link UP Jan 23 17:25:33.861982 systemd-networkd[809]: lo: Gained carrier Jan 23 17:25:33.863000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.862622 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 23 17:25:33.863964 systemd[1]: Reached target network.target - Network. Jan 23 17:25:33.937065 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 23 17:25:33.937000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:33.939625 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 23 17:25:34.012927 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. 
Jan 23 17:25:34.018263 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jan 23 17:25:34.018411 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 23 17:25:34.021422 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jan 23 17:25:34.027239 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 23 17:25:34.034611 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 23 17:25:34.042984 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 23 17:25:34.045618 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 23 17:25:34.063742 disk-uuid[880]: Primary Header is updated. Jan 23 17:25:34.063742 disk-uuid[880]: Secondary Entries is updated. Jan 23 17:25:34.063742 disk-uuid[880]: Secondary Header is updated. Jan 23 17:25:34.068430 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 23 17:25:34.068556 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 17:25:34.072000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:34.072963 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 23 17:25:34.076764 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Jan 23 17:25:34.076945 systemd-networkd[809]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 17:25:34.083632 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jan 23 17:25:34.085173 kernel: usbcore: registered new interface driver usbhid Jan 23 17:25:34.085276 kernel: usbhid: USB HID core driver Jan 23 17:25:34.076948 systemd-networkd[809]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 23 17:25:34.077384 systemd-networkd[809]: eth0: Link UP Jan 23 17:25:34.078485 systemd-networkd[809]: eth0: Gained carrier Jan 23 17:25:34.078498 systemd-networkd[809]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 23 17:25:34.112766 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 23 17:25:34.113000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:34.150378 systemd-networkd[809]: eth0: DHCPv4 address 10.0.1.37/25, gateway 10.0.1.1 acquired from 10.0.1.1 Jan 23 17:25:34.159387 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 23 17:25:34.160000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:34.160635 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 23 17:25:34.162290 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 23 17:25:34.164172 systemd[1]: Reached target remote-fs.target - Remote File Systems. 
Jan 23 17:25:34.166771 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 23 17:25:34.201555 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 23 17:25:34.202000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:35.108415 disk-uuid[882]: Warning: The kernel is still using the old partition table. Jan 23 17:25:35.108415 disk-uuid[882]: The new table will be used at the next reboot or after you Jan 23 17:25:35.108415 disk-uuid[882]: run partprobe(8) or kpartx(8) Jan 23 17:25:35.108415 disk-uuid[882]: The operation has completed successfully. Jan 23 17:25:35.112934 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 23 17:25:35.113000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:35.113000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:35.113039 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 23 17:25:35.115262 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Jan 23 17:25:35.149326 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (912) Jan 23 17:25:35.151909 kernel: BTRFS info (device vda6): first mount of filesystem 604c215e-feca-417a-a119-9b36e3a162e8 Jan 23 17:25:35.151965 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 23 17:25:35.156345 kernel: BTRFS info (device vda6): turning on async discard Jan 23 17:25:35.156403 kernel: BTRFS info (device vda6): enabling free space tree Jan 23 17:25:35.161335 kernel: BTRFS info (device vda6): last unmount of filesystem 604c215e-feca-417a-a119-9b36e3a162e8 Jan 23 17:25:35.162315 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 23 17:25:35.162000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:35.164369 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 23 17:25:35.311556 ignition[931]: Ignition 2.24.0 Jan 23 17:25:35.311573 ignition[931]: Stage: fetch-offline Jan 23 17:25:35.311609 ignition[931]: no configs at "/usr/lib/ignition/base.d" Jan 23 17:25:35.311619 ignition[931]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 17:25:35.313772 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 23 17:25:35.316000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:25:35.311774 ignition[931]: parsed url from cmdline: "" Jan 23 17:25:35.311777 ignition[931]: no config URL provided Jan 23 17:25:35.311781 ignition[931]: reading system config file "/usr/lib/ignition/user.ign" Jan 23 17:25:35.311789 ignition[931]: no config at "/usr/lib/ignition/user.ign" Jan 23 17:25:35.311794 ignition[931]: failed to fetch config: resource requires networking Jan 23 17:25:35.319447 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 23 17:25:35.311935 ignition[931]: Ignition finished successfully Jan 23 17:25:35.356080 ignition[942]: Ignition 2.24.0 Jan 23 17:25:35.356100 ignition[942]: Stage: fetch Jan 23 17:25:35.356240 ignition[942]: no configs at "/usr/lib/ignition/base.d" Jan 23 17:25:35.356248 ignition[942]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 17:25:35.356348 ignition[942]: parsed url from cmdline: "" Jan 23 17:25:35.356352 ignition[942]: no config URL provided Jan 23 17:25:35.356356 ignition[942]: reading system config file "/usr/lib/ignition/user.ign" Jan 23 17:25:35.356362 ignition[942]: no config at "/usr/lib/ignition/user.ign" Jan 23 17:25:35.356744 ignition[942]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 23 17:25:35.356759 ignition[942]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 23 17:25:35.356990 ignition[942]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 23 17:25:35.973636 systemd-networkd[809]: eth0: Gained IPv6LL Jan 23 17:25:36.357395 ignition[942]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 23 17:25:36.357506 ignition[942]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 23 17:25:37.358237 ignition[942]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 23 17:25:37.358290 ignition[942]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... 
Jan 23 17:25:38.289837 ignition[942]: GET result: OK Jan 23 17:25:38.289952 ignition[942]: parsing config with SHA512: a72ab3aa7a3ba6165bf9dfb7be5d77d6f973e35ec1dce1383be99a1732ff1400370e074586f581e9a8df23965c5f01decdf974e9750242cc64e3c92bb08e94cc Jan 23 17:25:38.295288 unknown[942]: fetched base config from "system" Jan 23 17:25:38.295298 unknown[942]: fetched base config from "system" Jan 23 17:25:38.295650 ignition[942]: fetch: fetch complete Jan 23 17:25:38.295315 unknown[942]: fetched user config from "openstack" Jan 23 17:25:38.295654 ignition[942]: fetch: fetch passed Jan 23 17:25:38.303150 kernel: kauditd_printk_skb: 20 callbacks suppressed Jan 23 17:25:38.303175 kernel: audit: type=1130 audit(1769189138.299:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:38.299000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:38.298613 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 23 17:25:38.295695 ignition[942]: Ignition finished successfully Jan 23 17:25:38.300973 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 23 17:25:38.335230 ignition[950]: Ignition 2.24.0 Jan 23 17:25:38.335247 ignition[950]: Stage: kargs Jan 23 17:25:38.335419 ignition[950]: no configs at "/usr/lib/ignition/base.d" Jan 23 17:25:38.335428 ignition[950]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 17:25:38.336184 ignition[950]: kargs: kargs passed Jan 23 17:25:38.336228 ignition[950]: Ignition finished successfully Jan 23 17:25:38.340175 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
Jan 23 17:25:38.340000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:38.344126 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 23 17:25:38.345684 kernel: audit: type=1130 audit(1769189138.340:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:38.368750 ignition[957]: Ignition 2.24.0 Jan 23 17:25:38.368769 ignition[957]: Stage: disks Jan 23 17:25:38.368916 ignition[957]: no configs at "/usr/lib/ignition/base.d" Jan 23 17:25:38.371739 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 23 17:25:38.368925 ignition[957]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 17:25:38.376988 kernel: audit: type=1130 audit(1769189138.372:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:38.372000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:38.372749 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 23 17:25:38.369678 ignition[957]: disks: disks passed Jan 23 17:25:38.376060 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 23 17:25:38.369722 ignition[957]: Ignition finished successfully Jan 23 17:25:38.377901 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 23 17:25:38.379378 systemd[1]: Reached target sysinit.target - System Initialization. Jan 23 17:25:38.380516 systemd[1]: Reached target basic.target - Basic System. 
Jan 23 17:25:38.383010 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 23 17:25:38.435926 systemd-fsck[966]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 23 17:25:38.442943 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 23 17:25:38.444000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:38.445251 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 23 17:25:38.449118 kernel: audit: type=1130 audit(1769189138.444:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:38.542328 kernel: EXT4-fs (vda9): mounted filesystem 6e8555bb-6998-46ec-8ba6-5a7a415f09ac r/w with ordered data mode. Quota mode: none. Jan 23 17:25:38.542601 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 23 17:25:38.543680 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 23 17:25:38.546783 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 23 17:25:38.572254 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 23 17:25:38.573243 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 23 17:25:38.574004 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 23 17:25:38.576363 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 23 17:25:38.576399 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. 
Jan 23 17:25:38.579649 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 23 17:25:38.582122 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 23 17:25:38.589332 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (975) Jan 23 17:25:38.593432 kernel: BTRFS info (device vda6): first mount of filesystem 604c215e-feca-417a-a119-9b36e3a162e8 Jan 23 17:25:38.593474 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 23 17:25:38.598738 kernel: BTRFS info (device vda6): turning on async discard Jan 23 17:25:38.599022 kernel: BTRFS info (device vda6): enabling free space tree Jan 23 17:25:38.601038 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 23 17:25:38.638378 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 17:25:38.739684 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 23 17:25:38.740000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:38.744578 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 23 17:25:38.746139 kernel: audit: type=1130 audit(1769189138.740:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:38.746019 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 23 17:25:38.763228 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 23 17:25:38.765076 kernel: BTRFS info (device vda6): last unmount of filesystem 604c215e-feca-417a-a119-9b36e3a162e8 Jan 23 17:25:38.783359 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Jan 23 17:25:38.783000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:38.787327 kernel: audit: type=1130 audit(1769189138.783:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:38.790381 ignition[1076]: INFO : Ignition 2.24.0 Jan 23 17:25:38.790381 ignition[1076]: INFO : Stage: mount Jan 23 17:25:38.792499 ignition[1076]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 23 17:25:38.792499 ignition[1076]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 17:25:38.792499 ignition[1076]: INFO : mount: mount passed Jan 23 17:25:38.792499 ignition[1076]: INFO : Ignition finished successfully Jan 23 17:25:38.798547 kernel: audit: type=1130 audit(1769189138.794:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:38.794000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:38.793441 systemd[1]: Finished ignition-mount.service - Ignition (mount). 
Jan 23 17:25:39.672400 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 17:25:41.681406 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 17:25:45.686409 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 17:25:45.690347 coreos-metadata[977]: Jan 23 17:25:45.690 WARN failed to locate config-drive, using the metadata service API instead Jan 23 17:25:45.709561 coreos-metadata[977]: Jan 23 17:25:45.709 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 23 17:25:47.307264 coreos-metadata[977]: Jan 23 17:25:47.307 INFO Fetch successful Jan 23 17:25:47.308365 coreos-metadata[977]: Jan 23 17:25:47.307 INFO wrote hostname ci-4547-1-0-a-d0877fd079 to /sysroot/etc/hostname Jan 23 17:25:47.311161 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 23 17:25:47.320899 kernel: audit: type=1130 audit(1769189147.314:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:47.320934 kernel: audit: type=1131 audit(1769189147.314:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:47.314000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:47.314000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:47.311271 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. 
Jan 23 17:25:47.320888 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 23 17:25:47.340771 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 23 17:25:47.364325 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1095) Jan 23 17:25:47.367569 kernel: BTRFS info (device vda6): first mount of filesystem 604c215e-feca-417a-a119-9b36e3a162e8 Jan 23 17:25:47.367623 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 23 17:25:47.371768 kernel: BTRFS info (device vda6): turning on async discard Jan 23 17:25:47.371832 kernel: BTRFS info (device vda6): enabling free space tree Jan 23 17:25:47.373354 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 23 17:25:47.397363 ignition[1113]: INFO : Ignition 2.24.0 Jan 23 17:25:47.397363 ignition[1113]: INFO : Stage: files Jan 23 17:25:47.398984 ignition[1113]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 23 17:25:47.398984 ignition[1113]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 17:25:47.398984 ignition[1113]: DEBUG : files: compiled without relabeling support, skipping Jan 23 17:25:47.402257 ignition[1113]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 23 17:25:47.402257 ignition[1113]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 23 17:25:47.406433 ignition[1113]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 23 17:25:47.407636 ignition[1113]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 23 17:25:47.407636 ignition[1113]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 23 17:25:47.406973 unknown[1113]: wrote ssh authorized keys file for user: core Jan 23 17:25:47.410766 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file 
"/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Jan 23 17:25:47.410766 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Jan 23 17:25:47.469014 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 23 17:25:47.573668 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Jan 23 17:25:47.573668 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 23 17:25:47.577042 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 23 17:25:47.577042 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 23 17:25:47.577042 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 23 17:25:47.577042 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 23 17:25:47.577042 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 23 17:25:47.577042 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 23 17:25:47.577042 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 23 17:25:47.587409 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 23 17:25:47.587409 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file 
"/sysroot/etc/flatcar/update.conf" Jan 23 17:25:47.587409 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Jan 23 17:25:47.587409 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Jan 23 17:25:47.587409 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Jan 23 17:25:47.587409 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-arm64.raw: attempt #1 Jan 23 17:25:47.948725 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 23 17:25:49.658876 ignition[1113]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Jan 23 17:25:49.658876 ignition[1113]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 23 17:25:49.662262 ignition[1113]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 23 17:25:49.664270 ignition[1113]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 23 17:25:49.664270 ignition[1113]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 23 17:25:49.664270 ignition[1113]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 23 17:25:49.667718 ignition[1113]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 23 17:25:49.667718 ignition[1113]: INFO : 
files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 23 17:25:49.667718 ignition[1113]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 23 17:25:49.667718 ignition[1113]: INFO : files: files passed Jan 23 17:25:49.667718 ignition[1113]: INFO : Ignition finished successfully Jan 23 17:25:49.676931 kernel: audit: type=1130 audit(1769189149.668:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.668000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.667413 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 23 17:25:49.671920 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 23 17:25:49.674177 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 23 17:25:49.689836 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 23 17:25:49.689960 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 23 17:25:49.696338 kernel: audit: type=1130 audit(1769189149.691:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.696368 kernel: audit: type=1131 audit(1769189149.693:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:25:49.691000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.693000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.696453 initrd-setup-root-after-ignition[1146]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 23 17:25:49.696453 initrd-setup-root-after-ignition[1146]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 23 17:25:49.699240 initrd-setup-root-after-ignition[1150]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 23 17:25:49.701357 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 23 17:25:49.702000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.702576 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 23 17:25:49.707019 kernel: audit: type=1130 audit(1769189149.702:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.707001 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 23 17:25:49.745862 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 23 17:25:49.745990 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
Jan 23 17:25:49.752674 kernel: audit: type=1130 audit(1769189149.747:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.752700 kernel: audit: type=1131 audit(1769189149.747:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.747000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.747000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.747822 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 23 17:25:49.753457 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 23 17:25:49.755094 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 23 17:25:49.755985 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 23 17:25:49.791711 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 23 17:25:49.792000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.794077 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 23 17:25:49.797413 kernel: audit: type=1130 audit(1769189149.792:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 23 17:25:49.815262 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 23 17:25:49.815468 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 23 17:25:49.817202 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 23 17:25:49.818947 systemd[1]: Stopped target timers.target - Timer Units. Jan 23 17:25:49.820370 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 23 17:25:49.821000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.820495 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 23 17:25:49.825402 kernel: audit: type=1131 audit(1769189149.821:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.824625 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 23 17:25:49.826172 systemd[1]: Stopped target basic.target - Basic System. Jan 23 17:25:49.827570 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 23 17:25:49.829004 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 23 17:25:49.830688 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 23 17:25:49.832171 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 23 17:25:49.833770 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 23 17:25:49.835219 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 23 17:25:49.836864 systemd[1]: Stopped target sysinit.target - System Initialization. 
Jan 23 17:25:49.838481 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 23 17:25:49.839896 systemd[1]: Stopped target swap.target - Swaps. Jan 23 17:25:49.841109 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 23 17:25:49.842000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.841241 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 23 17:25:49.843115 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 23 17:25:49.844759 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 23 17:25:49.846225 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 23 17:25:49.846321 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 23 17:25:49.849000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.848039 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 23 17:25:49.848161 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 23 17:25:49.851000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.850424 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 23 17:25:49.853000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:25:49.850540 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 23 17:25:49.852164 systemd[1]: ignition-files.service: Deactivated successfully. Jan 23 17:25:49.852261 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 23 17:25:49.857000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.854426 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 23 17:25:49.855974 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 23 17:25:49.856093 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 23 17:25:49.860000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.858392 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 23 17:25:49.862000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.859715 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 23 17:25:49.859842 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 23 17:25:49.864000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.861364 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 23 17:25:49.861472 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. 
Jan 23 17:25:49.863183 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 23 17:25:49.863290 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 23 17:25:49.868057 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 23 17:25:49.875391 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 23 17:25:49.877000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.877000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.886291 ignition[1170]: INFO : Ignition 2.24.0 Jan 23 17:25:49.886291 ignition[1170]: INFO : Stage: umount Jan 23 17:25:49.887874 ignition[1170]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 23 17:25:49.887874 ignition[1170]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 23 17:25:49.887874 ignition[1170]: INFO : umount: umount passed Jan 23 17:25:49.887874 ignition[1170]: INFO : Ignition finished successfully Jan 23 17:25:49.889000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.891000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.892000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:25:49.886949 systemd[1]: sysroot-boot.mount: Deactivated successfully. 
Jan 23 17:25:49.894000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.888925 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 23 17:25:49.889040 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 23 17:25:49.896000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.890376 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 23 17:25:49.890429 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 23 17:25:49.891905 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 23 17:25:49.891946 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 23 17:25:49.893097 systemd[1]: ignition-fetch.service: Deactivated successfully.
Jan 23 17:25:49.893141 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Jan 23 17:25:49.894560 systemd[1]: Stopped target network.target - Network.
Jan 23 17:25:49.895781 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 23 17:25:49.895827 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 23 17:25:49.897284 systemd[1]: Stopped target paths.target - Path Units.
Jan 23 17:25:49.898617 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 23 17:25:49.902342 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 23 17:25:49.903982 systemd[1]: Stopped target slices.target - Slice Units.
Jan 23 17:25:49.912000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.905184 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 23 17:25:49.913000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.906684 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 23 17:25:49.906724 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 23 17:25:49.908100 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 23 17:25:49.917000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.908129 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 23 17:25:49.919000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.909403 systemd[1]: systemd-journald-audit.socket: Deactivated successfully.
Jan 23 17:25:49.909424 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket.
Jan 23 17:25:49.910988 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 23 17:25:49.911042 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 23 17:25:49.912842 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 23 17:25:49.912883 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 23 17:25:49.914215 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 23 17:25:49.925000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.915535 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 23 17:25:49.917044 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 23 17:25:49.917138 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 23 17:25:49.918488 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 23 17:25:49.929000 audit: BPF prog-id=6 op=UNLOAD
Jan 23 17:25:49.929000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.918579 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 23 17:25:49.924253 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 23 17:25:49.924375 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 23 17:25:49.928523 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 23 17:25:49.928607 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 23 17:25:49.932881 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Jan 23 17:25:49.934082 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 23 17:25:49.937000 audit: BPF prog-id=9 op=UNLOAD
Jan 23 17:25:49.934126 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 23 17:25:49.939000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.936345 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 23 17:25:49.940000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.937839 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 23 17:25:49.942000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.937894 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 23 17:25:49.939521 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 17:25:49.939562 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 23 17:25:49.941050 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 17:25:49.941088 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 23 17:25:49.942561 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 23 17:25:49.958182 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 23 17:25:49.958350 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 23 17:25:49.960000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.961230 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 23 17:25:49.961318 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 23 17:25:49.963047 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 23 17:25:49.963083 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 23 17:25:49.965000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.964544 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 23 17:25:49.964589 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 23 17:25:49.967000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.966796 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 23 17:25:49.966852 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 23 17:25:49.970000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.969031 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 23 17:25:49.969083 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 23 17:25:49.975340 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 23 17:25:49.976185 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Jan 23 17:25:49.978000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.976250 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Jan 23 17:25:49.979000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.978144 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 23 17:25:49.981000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.978207 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 23 17:25:49.983000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.979927 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 23 17:25:49.979974 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 23 17:25:49.982455 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 23 17:25:49.982550 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 23 17:25:49.987000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.987000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:49.986619 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 23 17:25:49.986728 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 23 17:25:49.988352 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 23 17:25:49.990231 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 23 17:25:50.008536 systemd[1]: Switching root.
Jan 23 17:25:50.042009 systemd-journald[419]: Journal stopped
Jan 23 17:25:50.900744 systemd-journald[419]: Received SIGTERM from PID 1 (systemd).
Jan 23 17:25:50.900820 kernel: SELinux: policy capability network_peer_controls=1
Jan 23 17:25:50.900845 kernel: SELinux: policy capability open_perms=1
Jan 23 17:25:50.900857 kernel: SELinux: policy capability extended_socket_class=1
Jan 23 17:25:50.900873 kernel: SELinux: policy capability always_check_network=0
Jan 23 17:25:50.900885 kernel: SELinux: policy capability cgroup_seclabel=1
Jan 23 17:25:50.900897 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jan 23 17:25:50.900907 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jan 23 17:25:50.900917 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jan 23 17:25:50.900927 kernel: SELinux: policy capability userspace_initial_context=0
Jan 23 17:25:50.900937 systemd[1]: Successfully loaded SELinux policy in 65.493ms.
Jan 23 17:25:50.900956 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.863ms.
Jan 23 17:25:50.900968 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jan 23 17:25:50.900979 systemd[1]: Detected virtualization kvm.
Jan 23 17:25:50.900990 systemd[1]: Detected architecture arm64.
Jan 23 17:25:50.901000 systemd[1]: Detected first boot.
Jan 23 17:25:50.901011 systemd[1]: Hostname set to .
Jan 23 17:25:50.901023 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID.
Jan 23 17:25:50.901033 zram_generator::config[1214]: No configuration found.
Jan 23 17:25:50.901052 kernel: NET: Registered PF_VSOCK protocol family
Jan 23 17:25:50.901062 systemd[1]: Populated /etc with preset unit settings.
Jan 23 17:25:50.901075 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 23 17:25:50.901089 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jan 23 17:25:50.901099 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 23 17:25:50.901111 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jan 23 17:25:50.901122 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jan 23 17:25:50.901133 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jan 23 17:25:50.901146 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jan 23 17:25:50.901157 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jan 23 17:25:50.901167 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jan 23 17:25:50.901178 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jan 23 17:25:50.901189 systemd[1]: Created slice user.slice - User and Session Slice.
Jan 23 17:25:50.901200 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 23 17:25:50.901213 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 23 17:25:50.901224 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jan 23 17:25:50.901234 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jan 23 17:25:50.901245 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jan 23 17:25:50.901256 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 23 17:25:50.901267 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Jan 23 17:25:50.901278 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 23 17:25:50.901293 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 23 17:25:50.901315 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jan 23 17:25:50.901341 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jan 23 17:25:50.901354 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jan 23 17:25:50.901365 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jan 23 17:25:50.901375 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 23 17:25:50.901388 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 23 17:25:50.901399 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes.
Jan 23 17:25:50.901410 systemd[1]: Reached target slices.target - Slice Units.
Jan 23 17:25:50.901420 systemd[1]: Reached target swap.target - Swaps.
Jan 23 17:25:50.901432 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jan 23 17:25:50.901443 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jan 23 17:25:50.901454 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Jan 23 17:25:50.901464 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket.
Jan 23 17:25:50.901475 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket.
Jan 23 17:25:50.901485 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 23 17:25:50.901496 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket.
Jan 23 17:25:50.901508 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket.
Jan 23 17:25:50.901518 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 23 17:25:50.901529 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 23 17:25:50.901540 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jan 23 17:25:50.901552 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jan 23 17:25:50.901563 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jan 23 17:25:50.901573 systemd[1]: Mounting media.mount - External Media Directory...
Jan 23 17:25:50.901585 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jan 23 17:25:50.901599 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jan 23 17:25:50.901610 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jan 23 17:25:50.901621 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 23 17:25:50.901632 systemd[1]: Reached target machines.target - Containers.
Jan 23 17:25:50.901643 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jan 23 17:25:50.901653 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jan 23 17:25:50.901665 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 23 17:25:50.901676 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jan 23 17:25:50.901687 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 23 17:25:50.901697 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 23 17:25:50.901709 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 23 17:25:50.901720 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jan 23 17:25:50.901731 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 23 17:25:50.901742 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jan 23 17:25:50.901752 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 23 17:25:50.901763 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jan 23 17:25:50.901775 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jan 23 17:25:50.901787 systemd[1]: Stopped systemd-fsck-usr.service.
Jan 23 17:25:50.901798 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jan 23 17:25:50.901808 kernel: fuse: init (API version 7.41)
Jan 23 17:25:50.901819 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 23 17:25:50.901830 kernel: ACPI: bus type drm_connector registered
Jan 23 17:25:50.901840 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 23 17:25:50.901853 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jan 23 17:25:50.901864 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jan 23 17:25:50.901874 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Jan 23 17:25:50.901885 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 23 17:25:50.901896 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jan 23 17:25:50.901906 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jan 23 17:25:50.901942 systemd-journald[1281]: Collecting audit messages is enabled.
Jan 23 17:25:50.901971 systemd[1]: Mounted media.mount - External Media Directory.
Jan 23 17:25:50.901982 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jan 23 17:25:50.901993 systemd-journald[1281]: Journal started
Jan 23 17:25:50.902014 systemd-journald[1281]: Runtime Journal (/run/log/journal/99e74ae68da347feb6c5362e2b6ec98a) is 8M, max 319.5M, 311.5M free.
Jan 23 17:25:50.772000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1
Jan 23 17:25:50.853000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.855000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.858000 audit: BPF prog-id=14 op=UNLOAD
Jan 23 17:25:50.858000 audit: BPF prog-id=13 op=UNLOAD
Jan 23 17:25:50.860000 audit: BPF prog-id=15 op=LOAD
Jan 23 17:25:50.860000 audit: BPF prog-id=16 op=LOAD
Jan 23 17:25:50.860000 audit: BPF prog-id=17 op=LOAD
Jan 23 17:25:50.897000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1
Jan 23 17:25:50.897000 audit[1281]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=6 a1=ffffc955ef50 a2=4000 a3=0 items=0 ppid=1 pid=1281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:25:50.897000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald"
Jan 23 17:25:50.687363 systemd[1]: Queued start job for default target multi-user.target.
Jan 23 17:25:50.708601 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Jan 23 17:25:50.709087 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 23 17:25:50.904868 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 23 17:25:50.904000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.905810 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jan 23 17:25:50.907013 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jan 23 17:25:50.909000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.908424 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 23 17:25:50.909703 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 17:25:50.909864 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jan 23 17:25:50.910000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.910000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.911172 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 23 17:25:50.911346 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 23 17:25:50.911000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.912000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.912698 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 23 17:25:50.912898 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 23 17:25:50.913000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.913000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.914444 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 23 17:25:50.914612 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 23 17:25:50.915000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.915000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.916202 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 23 17:25:50.916373 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jan 23 17:25:50.916000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.916000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.917536 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 23 17:25:50.917692 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 23 17:25:50.918000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.918000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.919006 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jan 23 17:25:50.919000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.920247 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 23 17:25:50.920000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.921601 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jan 23 17:25:50.922000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.923465 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jan 23 17:25:50.924000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.924885 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Jan 23 17:25:50.925000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.936798 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jan 23 17:25:50.938640 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket.
Jan 23 17:25:50.940671 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Jan 23 17:25:50.942575 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Jan 23 17:25:50.943503 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jan 23 17:25:50.943530 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 23 17:25:50.945126 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Jan 23 17:25:50.946416 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 23 17:25:50.946523 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met.
Jan 23 17:25:50.953466 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jan 23 17:25:50.955262 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jan 23 17:25:50.956318 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 23 17:25:50.957272 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jan 23 17:25:50.958356 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 23 17:25:50.961452 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 23 17:25:50.964502 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jan 23 17:25:50.966494 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jan 23 17:25:50.968724 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Jan 23 17:25:50.970075 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Jan 23 17:25:50.971275 systemd-journald[1281]: Time spent on flushing to /var/log/journal/99e74ae68da347feb6c5362e2b6ec98a is 34.683ms for 1817 entries.
Jan 23 17:25:50.971275 systemd-journald[1281]: System Journal (/var/log/journal/99e74ae68da347feb6c5362e2b6ec98a) is 8M, max 588.1M, 580.1M free.
Jan 23 17:25:51.026108 systemd-journald[1281]: Received client request to flush runtime journal.
Jan 23 17:25:51.026158 kernel: loop1: detected capacity change from 0 to 100192
Jan 23 17:25:50.974000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.996000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.001000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:50.973339 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jan 23 17:25:50.976009 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jan 23 17:25:50.980561 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Jan 23 17:25:50.996031 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 23 17:25:51.000036 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 23 17:25:51.026099 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jan 23 17:25:51.027000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.029086 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Jan 23 17:25:51.030000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.032000 audit: BPF prog-id=18 op=LOAD
Jan 23 17:25:51.033389 kernel: loop2: detected capacity change from 0 to 45344
Jan 23 17:25:51.033000 audit: BPF prog-id=19 op=LOAD
Jan 23 17:25:51.033000 audit: BPF prog-id=20 op=LOAD
Jan 23 17:25:51.035000 audit: BPF prog-id=21 op=LOAD
Jan 23 17:25:51.043000 audit: BPF prog-id=22 op=LOAD
Jan 23 17:25:51.043000 audit: BPF prog-id=23 op=LOAD
Jan 23 17:25:51.043000 audit: BPF prog-id=24 op=LOAD
Jan 23 17:25:51.045000 audit: BPF prog-id=25 op=LOAD
Jan 23 17:25:51.045000 audit: BPF prog-id=26 op=LOAD
Jan 23 17:25:51.045000 audit: BPF prog-id=27 op=LOAD
Jan 23 17:25:51.081000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.084000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.086000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.034351 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer...
Jan 23 17:25:51.036526 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 23 17:25:51.041494 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 23 17:25:51.044436 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager...
Jan 23 17:25:51.046579 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Jan 23 17:25:51.076777 systemd-tmpfiles[1352]: ACLs are not supported, ignoring.
Jan 23 17:25:51.076788 systemd-tmpfiles[1352]: ACLs are not supported, ignoring.
Jan 23 17:25:51.080511 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 23 17:25:51.081979 systemd-nsresourced[1353]: Not setting up BPF subsystem, as functionality has been disabled at compile time.
Jan 23 17:25:51.082910 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Jan 23 17:25:51.084560 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager.
Jan 23 17:25:51.154322 systemd-oomd[1350]: No swap; memory pressure usage will be degraded
Jan 23 17:25:51.154840 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer.
Jan 23 17:25:51.155000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.165444 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Jan 23 17:25:51.166497 systemd-resolved[1351]: Positive Trust Anchors:
Jan 23 17:25:51.166513 systemd-resolved[1351]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 23 17:25:51.166517 systemd-resolved[1351]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16
Jan 23 17:25:51.166000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.166555 systemd-resolved[1351]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 23 17:25:51.174329 kernel: loop3: detected capacity change from 0 to 200800
Jan 23 17:25:51.177382 systemd-resolved[1351]: Using system hostname 'ci-4547-1-0-a-d0877fd079'.
Jan 23 17:25:51.178702 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 23 17:25:51.179000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.179713 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 23 17:25:51.224340 kernel: loop4: detected capacity change from 0 to 1648
Jan 23 17:25:51.255347 kernel: loop5: detected capacity change from 0 to 100192
Jan 23 17:25:51.268396 kernel: loop6: detected capacity change from 0 to 45344
Jan 23 17:25:51.285322 kernel: loop7: detected capacity change from 0 to 200800
Jan 23 17:25:51.305336 kernel: loop1: detected capacity change from 0 to 1648
Jan 23 17:25:51.309821 (sd-merge)[1377]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'.
Jan 23 17:25:51.312872 (sd-merge)[1377]: Merged extensions into '/usr'.
Jan 23 17:25:51.317074 systemd[1]: Reload requested from client PID 1334 ('systemd-sysext') (unit systemd-sysext.service)...
Jan 23 17:25:51.317099 systemd[1]: Reloading...
Jan 23 17:25:51.373382 zram_generator::config[1410]: No configuration found.
Jan 23 17:25:51.518176 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Jan 23 17:25:51.518600 systemd[1]: Reloading finished in 201 ms.
Jan 23 17:25:51.548536 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Jan 23 17:25:51.549000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.549889 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Jan 23 17:25:51.551000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.568599 systemd[1]: Starting ensure-sysext.service...
Jan 23 17:25:51.570324 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 23 17:25:51.570000 audit: BPF prog-id=8 op=UNLOAD
Jan 23 17:25:51.570000 audit: BPF prog-id=7 op=UNLOAD
Jan 23 17:25:51.571000 audit: BPF prog-id=28 op=LOAD
Jan 23 17:25:51.571000 audit: BPF prog-id=29 op=LOAD
Jan 23 17:25:51.572626 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 23 17:25:51.573000 audit: BPF prog-id=30 op=LOAD
Jan 23 17:25:51.573000 audit: BPF prog-id=21 op=UNLOAD
Jan 23 17:25:51.574000 audit: BPF prog-id=31 op=LOAD
Jan 23 17:25:51.574000 audit: BPF prog-id=22 op=UNLOAD
Jan 23 17:25:51.574000 audit: BPF prog-id=32 op=LOAD
Jan 23 17:25:51.574000 audit: BPF prog-id=33 op=LOAD
Jan 23 17:25:51.574000 audit: BPF prog-id=23 op=UNLOAD
Jan 23 17:25:51.574000 audit: BPF prog-id=24 op=UNLOAD
Jan 23 17:25:51.575000 audit: BPF prog-id=34 op=LOAD
Jan 23 17:25:51.575000 audit: BPF prog-id=18 op=UNLOAD
Jan 23 17:25:51.575000 audit: BPF prog-id=35 op=LOAD
Jan 23 17:25:51.575000 audit: BPF prog-id=36 op=LOAD
Jan 23 17:25:51.575000 audit: BPF prog-id=19 op=UNLOAD
Jan 23 17:25:51.575000 audit: BPF prog-id=20 op=UNLOAD
Jan 23 17:25:51.576000 audit: BPF prog-id=37 op=LOAD
Jan 23 17:25:51.576000 audit: BPF prog-id=25 op=UNLOAD
Jan 23 17:25:51.576000 audit: BPF prog-id=38 op=LOAD
Jan 23 17:25:51.576000 audit: BPF prog-id=39 op=LOAD
Jan 23 17:25:51.576000 audit: BPF prog-id=26 op=UNLOAD
Jan 23 17:25:51.576000 audit: BPF prog-id=27 op=UNLOAD
Jan 23 17:25:51.577000 audit: BPF prog-id=40 op=LOAD
Jan 23 17:25:51.577000 audit: BPF prog-id=15 op=UNLOAD
Jan 23 17:25:51.577000 audit: BPF prog-id=41 op=LOAD
Jan 23 17:25:51.577000 audit: BPF prog-id=42 op=LOAD
Jan 23 17:25:51.577000 audit: BPF prog-id=16 op=UNLOAD
Jan 23 17:25:51.577000 audit: BPF prog-id=17 op=UNLOAD
Jan 23 17:25:51.582041 systemd[1]: Reload requested from client PID 1444 ('systemctl') (unit ensure-sysext.service)...
Jan 23 17:25:51.582061 systemd[1]: Reloading...
Jan 23 17:25:51.584073 systemd-tmpfiles[1445]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Jan 23 17:25:51.584114 systemd-tmpfiles[1445]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Jan 23 17:25:51.584374 systemd-tmpfiles[1445]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Jan 23 17:25:51.585293 systemd-tmpfiles[1445]: ACLs are not supported, ignoring.
Jan 23 17:25:51.585365 systemd-tmpfiles[1445]: ACLs are not supported, ignoring.
Jan 23 17:25:51.595738 systemd-tmpfiles[1445]: Detected autofs mount point /boot during canonicalization of boot.
Jan 23 17:25:51.595754 systemd-tmpfiles[1445]: Skipping /boot
Jan 23 17:25:51.597580 systemd-udevd[1446]: Using default interface naming scheme 'v257'.
Jan 23 17:25:51.602131 systemd-tmpfiles[1445]: Detected autofs mount point /boot during canonicalization of boot.
Jan 23 17:25:51.602148 systemd-tmpfiles[1445]: Skipping /boot
Jan 23 17:25:51.635459 zram_generator::config[1478]: No configuration found.
Jan 23 17:25:51.745357 kernel: mousedev: PS/2 mouse device common for all mice
Jan 23 17:25:51.789990 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0
Jan 23 17:25:51.790081 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 23 17:25:51.790104 kernel: [drm] features: -context_init
Jan 23 17:25:51.791393 kernel: [drm] number of scanouts: 1
Jan 23 17:25:51.791477 kernel: [drm] number of cap sets: 0
Jan 23 17:25:51.804334 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0
Jan 23 17:25:51.810612 kernel: Console: switching to colour frame buffer device 160x50
Jan 23 17:25:51.814326 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 23 17:25:51.854630 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Jan 23 17:25:51.854818 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Jan 23 17:25:51.856274 systemd[1]: Reloading finished in 273 ms.
Jan 23 17:25:51.867104 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 23 17:25:51.867000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.869000 audit: BPF prog-id=43 op=LOAD
Jan 23 17:25:51.869000 audit: BPF prog-id=40 op=UNLOAD
Jan 23 17:25:51.869000 audit: BPF prog-id=44 op=LOAD
Jan 23 17:25:51.869000 audit: BPF prog-id=45 op=LOAD
Jan 23 17:25:51.869000 audit: BPF prog-id=41 op=UNLOAD
Jan 23 17:25:51.869000 audit: BPF prog-id=42 op=UNLOAD
Jan 23 17:25:51.870000 audit: BPF prog-id=46 op=LOAD
Jan 23 17:25:51.870000 audit: BPF prog-id=30 op=UNLOAD
Jan 23 17:25:51.870000 audit: BPF prog-id=47 op=LOAD
Jan 23 17:25:51.870000 audit: BPF prog-id=31 op=UNLOAD
Jan 23 17:25:51.870000 audit: BPF prog-id=48 op=LOAD
Jan 23 17:25:51.870000 audit: BPF prog-id=49 op=LOAD
Jan 23 17:25:51.870000 audit: BPF prog-id=32 op=UNLOAD
Jan 23 17:25:51.870000 audit: BPF prog-id=33 op=UNLOAD
Jan 23 17:25:51.871000 audit: BPF prog-id=50 op=LOAD
Jan 23 17:25:51.871000 audit: BPF prog-id=34 op=UNLOAD
Jan 23 17:25:51.871000 audit: BPF prog-id=51 op=LOAD
Jan 23 17:25:51.871000 audit: BPF prog-id=52 op=LOAD
Jan 23 17:25:51.871000 audit: BPF prog-id=35 op=UNLOAD
Jan 23 17:25:51.871000 audit: BPF prog-id=36 op=UNLOAD
Jan 23 17:25:51.872000 audit: BPF prog-id=53 op=LOAD
Jan 23 17:25:51.872000 audit: BPF prog-id=54 op=LOAD
Jan 23 17:25:51.872000 audit: BPF prog-id=28 op=UNLOAD
Jan 23 17:25:51.872000 audit: BPF prog-id=29 op=UNLOAD
Jan 23 17:25:51.872000 audit: BPF prog-id=55 op=LOAD
Jan 23 17:25:51.883000 audit: BPF prog-id=37 op=UNLOAD
Jan 23 17:25:51.883000 audit: BPF prog-id=56 op=LOAD
Jan 23 17:25:51.883000 audit: BPF prog-id=57 op=LOAD
Jan 23 17:25:51.883000 audit: BPF prog-id=38 op=UNLOAD
Jan 23 17:25:51.883000 audit: BPF prog-id=39 op=UNLOAD
Jan 23 17:25:51.886275 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 23 17:25:51.886000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.907685 systemd[1]: Finished ensure-sysext.service.
Jan 23 17:25:51.908000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.925256 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jan 23 17:25:51.927173 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Jan 23 17:25:51.928321 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jan 23 17:25:51.929249 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 23 17:25:51.941472 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 23 17:25:51.946704 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 23 17:25:51.948820 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 23 17:25:51.950958 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm...
Jan 23 17:25:51.953664 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 23 17:25:51.953792 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met.
Jan 23 17:25:51.954927 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Jan 23 17:25:51.957586 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Jan 23 17:25:51.959523 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jan 23 17:25:51.962836 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Jan 23 17:25:51.966000 audit: BPF prog-id=58 op=LOAD
Jan 23 17:25:51.968172 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 23 17:25:51.970402 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 23 17:25:51.970461 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Jan 23 17:25:51.970512 systemd[1]: Reached target time-set.target - System Time Set.
Jan 23 17:25:51.974542 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Jan 23 17:25:51.979360 kernel: PTP clock support registered
Jan 23 17:25:51.976904 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 23 17:25:51.979323 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 23 17:25:51.982018 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 23 17:25:51.983000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.983000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.985576 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 23 17:25:51.988407 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 23 17:25:51.988000 audit[1591]: SYSTEM_BOOT pid=1591 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.988000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.988000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.990655 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 23 17:25:51.990862 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 23 17:25:51.991000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.991000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.992143 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 23 17:25:51.992364 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 23 17:25:51.992000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.992000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.993435 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully.
Jan 23 17:25:51.993608 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm.
Jan 23 17:25:51.996000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.996000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:51.996804 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Jan 23 17:25:51.998000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:52.008893 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 23 17:25:52.009053 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 23 17:25:52.013128 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Jan 23 17:25:52.014000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:52.027687 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Jan 23 17:25:52.028000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:25:52.030000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1
Jan 23 17:25:52.030000 audit[1611]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffddd7a0f0 a2=420 a3=0 items=0 ppid=1566 pid=1611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:25:52.030000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573
Jan 23 17:25:52.031364 augenrules[1611]: No rules
Jan 23 17:25:52.032751 systemd[1]: audit-rules.service: Deactivated successfully.
Jan 23 17:25:52.033037 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jan 23 17:25:52.068097 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 23 17:25:52.069372 systemd-networkd[1585]: lo: Link UP
Jan 23 17:25:52.069383 systemd-networkd[1585]: lo: Gained carrier
Jan 23 17:25:52.070506 systemd-networkd[1585]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Jan 23 17:25:52.070516 systemd-networkd[1585]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jan 23 17:25:52.071377 systemd-networkd[1585]: eth0: Link UP
Jan 23 17:25:52.071576 systemd-networkd[1585]: eth0: Gained carrier
Jan 23 17:25:52.071589 systemd-networkd[1585]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Jan 23 17:25:52.072097 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 23 17:25:52.073350 systemd[1]: Reached target network.target - Network.
Jan 23 17:25:52.075431 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Jan 23 17:25:52.077409 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Jan 23 17:25:52.087382 systemd-networkd[1585]: eth0: DHCPv4 address 10.0.1.37/25, gateway 10.0.1.1 acquired from 10.0.1.1
Jan 23 17:25:52.094411 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Jan 23 17:25:52.095683 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jan 23 17:25:52.101945 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Jan 23 17:25:52.522793 ldconfig[1579]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Jan 23 17:25:52.528373 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Jan 23 17:25:52.530675 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Jan 23 17:25:52.557386 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Jan 23 17:25:52.558516 systemd[1]: Reached target sysinit.target - System Initialization.
Jan 23 17:25:52.559427 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Jan 23 17:25:52.560399 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Jan 23 17:25:52.561489 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Jan 23 17:25:52.562400 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Jan 23 17:25:52.563397 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update.
Jan 23 17:25:52.564411 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update.
Jan 23 17:25:52.565245 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Jan 23 17:25:52.566351 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Jan 23 17:25:52.566384 systemd[1]: Reached target paths.target - Path Units.
Jan 23 17:25:52.567061 systemd[1]: Reached target timers.target - Timer Units.
Jan 23 17:25:52.568789 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Jan 23 17:25:52.570865 systemd[1]: Starting docker.socket - Docker Socket for the API...
Jan 23 17:25:52.573463 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Jan 23 17:25:52.574623 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Jan 23 17:25:52.575612 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Jan 23 17:25:52.579262 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Jan 23 17:25:52.580398 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Jan 23 17:25:52.581847 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Jan 23 17:25:52.582845 systemd[1]: Reached target sockets.target - Socket Units.
Jan 23 17:25:52.583630 systemd[1]: Reached target basic.target - Basic System.
Jan 23 17:25:52.584380 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Jan 23 17:25:52.584410 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Jan 23 17:25:52.586716 systemd[1]: Starting chronyd.service - NTP client/server...
Jan 23 17:25:52.588350 systemd[1]: Starting containerd.service - containerd container runtime...
Jan 23 17:25:52.590399 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Jan 23 17:25:52.593466 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Jan 23 17:25:52.595079 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Jan 23 17:25:52.597677 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Jan 23 17:25:52.598325 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 17:25:52.602539 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Jan 23 17:25:52.603504 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Jan 23 17:25:52.605752 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Jan 23 17:25:52.607461 jq[1635]: false
Jan 23 17:25:52.607527 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Jan 23 17:25:52.611292 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Jan 23 17:25:52.613238 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Jan 23 17:25:52.616490 systemd[1]: Starting systemd-logind.service - User Login Management...
Jan 23 17:25:52.617464 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Jan 23 17:25:52.617847 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Jan 23 17:25:52.618660 systemd[1]: Starting update-engine.service - Update Engine...
Jan 23 17:25:52.623492 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Jan 23 17:25:52.626938 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Jan 23 17:25:52.628436 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Jan 23 17:25:52.628669 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Jan 23 17:25:52.630693 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Jan 23 17:25:52.630912 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Jan 23 17:25:52.638706 extend-filesystems[1638]: Found /dev/vda6
Jan 23 17:25:52.642822 systemd[1]: motdgen.service: Deactivated successfully.
Jan 23 17:25:52.643078 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Jan 23 17:25:52.644881 chronyd[1630]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Jan 23 17:25:52.647853 extend-filesystems[1638]: Found /dev/vda9
Jan 23 17:25:52.649702 chronyd[1630]: Loaded seccomp filter (level 2)
Jan 23 17:25:52.650489 jq[1649]: true
Jan 23 17:25:52.650599 systemd[1]: Started chronyd.service - NTP client/server.
Jan 23 17:25:52.652433 extend-filesystems[1638]: Checking size of /dev/vda9
Jan 23 17:25:52.657957 update_engine[1647]: I20260123 17:25:52.657569 1647 main.cc:92] Flatcar Update Engine starting
Jan 23 17:25:52.664281 jq[1677]: true
Jan 23 17:25:52.678323 tar[1657]: linux-arm64/LICENSE
Jan 23 17:25:52.678323 tar[1657]: linux-arm64/helm
Jan 23 17:25:52.683735 dbus-daemon[1633]: [system] SELinux support is enabled
Jan 23 17:25:52.684015 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Jan 23 17:25:52.687537 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Jan 23 17:25:52.687566 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Jan 23 17:25:52.689318 extend-filesystems[1638]: Resized partition /dev/vda9
Jan 23 17:25:52.689014 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Jan 23 17:25:52.691482 update_engine[1647]: I20260123 17:25:52.690454 1647 update_check_scheduler.cc:74] Next update check in 6m17s
Jan 23 17:25:52.689030 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Jan 23 17:25:52.697319 extend-filesystems[1693]: resize2fs 1.47.3 (8-Jul-2025)
Jan 23 17:25:52.693652 systemd[1]: Started update-engine.service - Update Engine.
Jan 23 17:25:52.698070 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Jan 23 17:25:52.704337 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks
Jan 23 17:25:52.710881 systemd-logind[1644]: New seat seat0.
Jan 23 17:25:52.712397 systemd-logind[1644]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 17:25:52.712413 systemd-logind[1644]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Jan 23 17:25:52.712659 systemd[1]: Started systemd-logind.service - User Login Management.
Jan 23 17:25:52.779784 locksmithd[1697]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Jan 23 17:25:52.796766 bash[1703]: Updated "/home/core/.ssh/authorized_keys"
Jan 23 17:25:52.798748 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Jan 23 17:25:52.803613 systemd[1]: Starting sshkeys.service...
Jan 23 17:25:52.830849 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Jan 23 17:25:52.835124 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Jan 23 17:25:52.836526 containerd[1675]: time="2026-01-23T17:25:52Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Jan 23 17:25:52.838574 containerd[1675]: time="2026-01-23T17:25:52.838411160Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5
Jan 23 17:25:52.857497 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 23 17:25:52.859343 containerd[1675]: time="2026-01-23T17:25:52.859129880Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.4µs"
Jan 23 17:25:52.859343 containerd[1675]: time="2026-01-23T17:25:52.859169280Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Jan 23 17:25:52.859343 containerd[1675]: time="2026-01-23T17:25:52.859256760Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Jan 23 17:25:52.859343 containerd[1675]: time="2026-01-23T17:25:52.859271160Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Jan 23 17:25:52.859496 containerd[1675]: time="2026-01-23T17:25:52.859470640Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Jan 23 17:25:52.859520 containerd[1675]: time="2026-01-23T17:25:52.859497080Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Jan 23 17:25:52.859640 containerd[1675]: time="2026-01-23T17:25:52.859616800Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Jan 23 17:25:52.859640 containerd[1675]: time="2026-01-23T17:25:52.859637040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Jan 23 17:25:52.860034 containerd[1675]: time="2026-01-23T17:25:52.860008040Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Jan 23 17:25:52.860065 containerd[1675]: time="2026-01-23T17:25:52.860033560Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Jan 23 17:25:52.860065 containerd[1675]: time="2026-01-23T17:25:52.860046880Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Jan 23 17:25:52.860065 containerd[1675]: time="2026-01-23T17:25:52.860056120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1
Jan 23 17:25:52.860226 containerd[1675]: time="2026-01-23T17:25:52.860203920Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1
Jan 23 17:25:52.860226 containerd[1675]: time="2026-01-23T17:25:52.860223520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Jan 23 17:25:52.860335 containerd[1675]: time="2026-01-23T17:25:52.860301680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Jan 23 17:25:52.860726 containerd[1675]: time="2026-01-23T17:25:52.860700920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Jan 23 17:25:52.860959 containerd[1675]: time="2026-01-23T17:25:52.860934640Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Jan 23 17:25:52.860959 containerd[1675]: time="2026-01-23T17:25:52.860956200Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Jan 23 17:25:52.861010 containerd[1675]: time="2026-01-23T17:25:52.860986840Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Jan 23 17:25:52.861403 containerd[1675]: time="2026-01-23T17:25:52.861366520Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Jan 23 17:25:52.861495 containerd[1675]: time="2026-01-23T17:25:52.861475560Z" level=info msg="metadata content store policy set" policy=shared
Jan 23 17:25:52.888519 containerd[1675]: time="2026-01-23T17:25:52.888367840Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Jan 23 17:25:52.888618 containerd[1675]: time="2026-01-23T17:25:52.888557440Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1
Jan 23 17:25:52.888672 containerd[1675]: time="2026-01-23T17:25:52.888651080Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1
Jan 23 17:25:52.888672 containerd[1675]: time="2026-01-23T17:25:52.888664560Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Jan 23 17:25:52.888731 containerd[1675]: time="2026-01-23T17:25:52.888676840Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Jan 23 17:25:52.888731 containerd[1675]: time="2026-01-23T17:25:52.888688400Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service
type=io.containerd.service.v1 Jan 23 17:25:52.888731 containerd[1675]: time="2026-01-23T17:25:52.888700040Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 23 17:25:52.888731 containerd[1675]: time="2026-01-23T17:25:52.888709480Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 23 17:25:52.888731 containerd[1675]: time="2026-01-23T17:25:52.888720320Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 23 17:25:52.888808 containerd[1675]: time="2026-01-23T17:25:52.888732440Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 23 17:25:52.888808 containerd[1675]: time="2026-01-23T17:25:52.888745480Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 23 17:25:52.888808 containerd[1675]: time="2026-01-23T17:25:52.888759120Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 23 17:25:52.888808 containerd[1675]: time="2026-01-23T17:25:52.888768080Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 23 17:25:52.888808 containerd[1675]: time="2026-01-23T17:25:52.888779480Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 23 17:25:52.889125 containerd[1675]: time="2026-01-23T17:25:52.888912320Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 23 17:25:52.889125 containerd[1675]: time="2026-01-23T17:25:52.888931400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 23 17:25:52.889125 containerd[1675]: time="2026-01-23T17:25:52.888945400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content 
type=io.containerd.grpc.v1 Jan 23 17:25:52.889125 containerd[1675]: time="2026-01-23T17:25:52.888955080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 23 17:25:52.889125 containerd[1675]: time="2026-01-23T17:25:52.888964320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 23 17:25:52.889125 containerd[1675]: time="2026-01-23T17:25:52.888974480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 23 17:25:52.889125 containerd[1675]: time="2026-01-23T17:25:52.888985960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 23 17:25:52.889125 containerd[1675]: time="2026-01-23T17:25:52.888995520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 23 17:25:52.889125 containerd[1675]: time="2026-01-23T17:25:52.889005960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 23 17:25:52.889125 containerd[1675]: time="2026-01-23T17:25:52.889017920Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 23 17:25:52.889125 containerd[1675]: time="2026-01-23T17:25:52.889027720Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 23 17:25:52.889125 containerd[1675]: time="2026-01-23T17:25:52.889052080Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 23 17:25:52.889125 containerd[1675]: time="2026-01-23T17:25:52.889090320Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 23 17:25:52.889125 containerd[1675]: time="2026-01-23T17:25:52.889104480Z" level=info msg="Start snapshots syncer" Jan 23 17:25:52.889125 containerd[1675]: 
time="2026-01-23T17:25:52.889131280Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 23 17:25:52.890490 containerd[1675]: time="2026-01-23T17:25:52.889510400Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 23 
17:25:52.890490 containerd[1675]: time="2026-01-23T17:25:52.889614160Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 23 17:25:52.890609 containerd[1675]: time="2026-01-23T17:25:52.889662400Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 23 17:25:52.890609 containerd[1675]: time="2026-01-23T17:25:52.889815520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 23 17:25:52.890609 containerd[1675]: time="2026-01-23T17:25:52.889838000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 23 17:25:52.890609 containerd[1675]: time="2026-01-23T17:25:52.889848920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 23 17:25:52.890609 containerd[1675]: time="2026-01-23T17:25:52.889861480Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 23 17:25:52.890609 containerd[1675]: time="2026-01-23T17:25:52.889872640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 23 17:25:52.890609 containerd[1675]: time="2026-01-23T17:25:52.889883320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 23 17:25:52.890609 containerd[1675]: time="2026-01-23T17:25:52.889893800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 23 17:25:52.890609 containerd[1675]: time="2026-01-23T17:25:52.889904480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 23 17:25:52.890609 containerd[1675]: time="2026-01-23T17:25:52.889914680Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 23 
17:25:52.890609 containerd[1675]: time="2026-01-23T17:25:52.890010320Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 23 17:25:52.890609 containerd[1675]: time="2026-01-23T17:25:52.890028680Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 23 17:25:52.890609 containerd[1675]: time="2026-01-23T17:25:52.890038880Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 23 17:25:52.890817 containerd[1675]: time="2026-01-23T17:25:52.890048400Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 23 17:25:52.890817 containerd[1675]: time="2026-01-23T17:25:52.890056080Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 23 17:25:52.890817 containerd[1675]: time="2026-01-23T17:25:52.890065720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 23 17:25:52.890817 containerd[1675]: time="2026-01-23T17:25:52.890075440Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 23 17:25:52.890817 containerd[1675]: time="2026-01-23T17:25:52.890225920Z" level=info msg="runtime interface created" Jan 23 17:25:52.890817 containerd[1675]: time="2026-01-23T17:25:52.890236800Z" level=info msg="created NRI interface" Jan 23 17:25:52.890817 containerd[1675]: time="2026-01-23T17:25:52.890246560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 23 17:25:52.890817 containerd[1675]: time="2026-01-23T17:25:52.890258360Z" level=info msg="Connect containerd service" Jan 23 17:25:52.890817 containerd[1675]: time="2026-01-23T17:25:52.890281560Z" level=info 
msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 23 17:25:52.891423 containerd[1675]: time="2026-01-23T17:25:52.891299440Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 23 17:25:52.953335 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 23 17:25:52.977326 extend-filesystems[1693]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 23 17:25:52.977326 extend-filesystems[1693]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 23 17:25:52.977326 extend-filesystems[1693]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Jan 23 17:25:52.980936 extend-filesystems[1638]: Resized filesystem in /dev/vda9 Jan 23 17:25:52.977766 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 23 17:25:52.980376 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 23 17:25:52.987373 containerd[1675]: time="2026-01-23T17:25:52.986738040Z" level=info msg="Start subscribing containerd event" Jan 23 17:25:52.987373 containerd[1675]: time="2026-01-23T17:25:52.986811080Z" level=info msg="Start recovering state" Jan 23 17:25:52.987373 containerd[1675]: time="2026-01-23T17:25:52.986908920Z" level=info msg="Start event monitor" Jan 23 17:25:52.987373 containerd[1675]: time="2026-01-23T17:25:52.986921200Z" level=info msg="Start cni network conf syncer for default" Jan 23 17:25:52.987373 containerd[1675]: time="2026-01-23T17:25:52.986933680Z" level=info msg="Start streaming server" Jan 23 17:25:52.987373 containerd[1675]: time="2026-01-23T17:25:52.986942880Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 23 17:25:52.987373 containerd[1675]: time="2026-01-23T17:25:52.986950880Z" level=info msg="runtime interface starting up..." 
Jan 23 17:25:52.987373 containerd[1675]: time="2026-01-23T17:25:52.986960240Z" level=info msg="starting plugins..." Jan 23 17:25:52.987373 containerd[1675]: time="2026-01-23T17:25:52.986974640Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 23 17:25:52.987373 containerd[1675]: time="2026-01-23T17:25:52.987231680Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 23 17:25:52.987373 containerd[1675]: time="2026-01-23T17:25:52.987284720Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 23 17:25:52.990615 containerd[1675]: time="2026-01-23T17:25:52.990587880Z" level=info msg="containerd successfully booted in 0.154416s" Jan 23 17:25:52.990763 systemd[1]: Started containerd.service - containerd container runtime. Jan 23 17:25:53.093383 tar[1657]: linux-arm64/README.md Jan 23 17:25:53.109795 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 23 17:25:53.318495 systemd-networkd[1585]: eth0: Gained IPv6LL Jan 23 17:25:53.321361 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 23 17:25:53.323220 systemd[1]: Reached target network-online.target - Network is Online. Jan 23 17:25:53.325739 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 17:25:53.328009 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 23 17:25:53.363369 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 23 17:25:53.616332 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 17:25:53.872507 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 17:25:53.932125 sshd_keygen[1672]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 23 17:25:53.953413 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 23 17:25:53.956566 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Jan 23 17:25:53.974095 systemd[1]: issuegen.service: Deactivated successfully. Jan 23 17:25:53.975182 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 23 17:25:53.978990 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 23 17:25:53.995094 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 23 17:25:53.998997 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 23 17:25:54.003175 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 23 17:25:54.005045 systemd[1]: Reached target getty.target - Login Prompts. Jan 23 17:25:54.297470 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 17:25:54.301844 (kubelet)[1775]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 17:25:54.787759 kubelet[1775]: E0123 17:25:54.787666 1775 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 17:25:54.790043 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 17:25:54.790190 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 17:25:54.790871 systemd[1]: kubelet.service: Consumed 721ms CPU time, 249.5M memory peak. 
Jan 23 17:25:55.627389 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 17:25:55.884344 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 17:25:59.635362 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 17:25:59.644645 coreos-metadata[1632]: Jan 23 17:25:59.644 WARN failed to locate config-drive, using the metadata service API instead Jan 23 17:25:59.659351 coreos-metadata[1632]: Jan 23 17:25:59.659 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 23 17:25:59.899363 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 23 17:25:59.904301 coreos-metadata[1716]: Jan 23 17:25:59.904 WARN failed to locate config-drive, using the metadata service API instead Jan 23 17:25:59.917539 coreos-metadata[1716]: Jan 23 17:25:59.917 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 23 17:26:01.495059 coreos-metadata[1632]: Jan 23 17:26:01.494 INFO Fetch successful Jan 23 17:26:01.495059 coreos-metadata[1632]: Jan 23 17:26:01.495 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 23 17:26:02.198169 coreos-metadata[1716]: Jan 23 17:26:02.198 INFO Fetch successful Jan 23 17:26:02.198169 coreos-metadata[1716]: Jan 23 17:26:02.198 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 23 17:26:02.801208 coreos-metadata[1632]: Jan 23 17:26:02.801 INFO Fetch successful Jan 23 17:26:02.801208 coreos-metadata[1632]: Jan 23 17:26:02.801 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 23 17:26:03.470364 coreos-metadata[1716]: Jan 23 17:26:03.470 INFO Fetch successful Jan 23 17:26:03.476107 coreos-metadata[1632]: Jan 23 17:26:03.476 INFO Fetch successful Jan 23 17:26:03.476107 coreos-metadata[1632]: Jan 23 17:26:03.476 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 23 17:26:03.478218 unknown[1716]: wrote ssh authorized keys 
file for user: core Jan 23 17:26:03.518901 update-ssh-keys[1794]: Updated "/home/core/.ssh/authorized_keys" Jan 23 17:26:03.519941 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 23 17:26:03.521262 systemd[1]: Finished sshkeys.service. Jan 23 17:26:04.125893 coreos-metadata[1632]: Jan 23 17:26:04.125 INFO Fetch successful Jan 23 17:26:04.125893 coreos-metadata[1632]: Jan 23 17:26:04.125 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 23 17:26:04.894265 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 23 17:26:04.895809 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 17:26:05.040710 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 17:26:05.044836 (kubelet)[1805]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 17:26:05.077224 kubelet[1805]: E0123 17:26:05.077148 1805 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 17:26:05.079998 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 17:26:05.080116 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 17:26:05.080490 systemd[1]: kubelet.service: Consumed 144ms CPU time, 107.1M memory peak. 
Jan 23 17:26:07.591847 coreos-metadata[1632]: Jan 23 17:26:07.591 INFO Fetch successful Jan 23 17:26:07.591847 coreos-metadata[1632]: Jan 23 17:26:07.591 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 23 17:26:10.130818 coreos-metadata[1632]: Jan 23 17:26:10.130 INFO Fetch successful Jan 23 17:26:10.170425 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 23 17:26:10.171055 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 23 17:26:10.171186 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 23 17:26:10.171369 systemd[1]: Startup finished in 2.386s (kernel) + 17.022s (initrd) + 20.068s (userspace) = 39.478s. Jan 23 17:26:11.951082 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 23 17:26:11.952518 systemd[1]: Started sshd@0-10.0.1.37:22-4.153.228.146:56676.service - OpenSSH per-connection server daemon (4.153.228.146:56676). Jan 23 17:26:12.502644 sshd[1820]: Accepted publickey for core from 4.153.228.146 port 56676 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:26:12.504469 sshd-session[1820]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:26:12.510865 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 23 17:26:12.511885 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 23 17:26:12.516544 systemd-logind[1644]: New session 1 of user core. Jan 23 17:26:12.547014 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 23 17:26:12.550697 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 23 17:26:12.581839 (systemd)[1826]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:26:12.584417 systemd-logind[1644]: New session 2 of user core. 
Jan 23 17:26:12.703296 systemd[1826]: Queued start job for default target default.target. Jan 23 17:26:12.724690 systemd[1826]: Created slice app.slice - User Application Slice. Jan 23 17:26:12.724729 systemd[1826]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 23 17:26:12.724741 systemd[1826]: Reached target paths.target - Paths. Jan 23 17:26:12.724794 systemd[1826]: Reached target timers.target - Timers. Jan 23 17:26:12.725988 systemd[1826]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 23 17:26:12.726809 systemd[1826]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 23 17:26:12.736123 systemd[1826]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 23 17:26:12.738367 systemd[1826]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 23 17:26:12.738561 systemd[1826]: Reached target sockets.target - Sockets. Jan 23 17:26:12.738612 systemd[1826]: Reached target basic.target - Basic System. Jan 23 17:26:12.738641 systemd[1826]: Reached target default.target - Main User Target. Jan 23 17:26:12.738665 systemd[1826]: Startup finished in 149ms. Jan 23 17:26:12.739036 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 23 17:26:12.740378 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 23 17:26:13.046680 systemd[1]: Started sshd@1-10.0.1.37:22-4.153.228.146:56690.service - OpenSSH per-connection server daemon (4.153.228.146:56690). Jan 23 17:26:13.589744 sshd[1840]: Accepted publickey for core from 4.153.228.146 port 56690 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:26:13.591084 sshd-session[1840]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:26:13.594872 systemd-logind[1644]: New session 3 of user core. Jan 23 17:26:13.609686 systemd[1]: Started session-3.scope - Session 3 of User core. 
Jan 23 17:26:13.889639 sshd[1844]: Connection closed by 4.153.228.146 port 56690 Jan 23 17:26:13.889894 sshd-session[1840]: pam_unix(sshd:session): session closed for user core Jan 23 17:26:13.894074 systemd[1]: sshd@1-10.0.1.37:22-4.153.228.146:56690.service: Deactivated successfully. Jan 23 17:26:13.895671 systemd[1]: session-3.scope: Deactivated successfully. Jan 23 17:26:13.896352 systemd-logind[1644]: Session 3 logged out. Waiting for processes to exit. Jan 23 17:26:13.897247 systemd-logind[1644]: Removed session 3. Jan 23 17:26:13.998385 systemd[1]: Started sshd@2-10.0.1.37:22-4.153.228.146:56706.service - OpenSSH per-connection server daemon (4.153.228.146:56706). Jan 23 17:26:14.535067 sshd[1850]: Accepted publickey for core from 4.153.228.146 port 56706 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:26:14.536272 sshd-session[1850]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:26:14.540217 systemd-logind[1644]: New session 4 of user core. Jan 23 17:26:14.546657 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 23 17:26:14.821776 sshd[1854]: Connection closed by 4.153.228.146 port 56706 Jan 23 17:26:14.822060 sshd-session[1850]: pam_unix(sshd:session): session closed for user core Jan 23 17:26:14.825992 systemd[1]: sshd@2-10.0.1.37:22-4.153.228.146:56706.service: Deactivated successfully. Jan 23 17:26:14.827567 systemd[1]: session-4.scope: Deactivated successfully. Jan 23 17:26:14.828349 systemd-logind[1644]: Session 4 logged out. Waiting for processes to exit. Jan 23 17:26:14.829388 systemd-logind[1644]: Removed session 4. Jan 23 17:26:14.930712 systemd[1]: Started sshd@3-10.0.1.37:22-4.153.228.146:57958.service - OpenSSH per-connection server daemon (4.153.228.146:57958). Jan 23 17:26:15.144147 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 23 17:26:15.145595 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 23 17:26:15.274810 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 17:26:15.279157 (kubelet)[1871]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 17:26:15.310370 kubelet[1871]: E0123 17:26:15.310319 1871 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 17:26:15.312729 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 17:26:15.312857 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 17:26:15.314405 systemd[1]: kubelet.service: Consumed 136ms CPU time, 106.8M memory peak. Jan 23 17:26:15.465504 sshd[1860]: Accepted publickey for core from 4.153.228.146 port 57958 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:26:15.466756 sshd-session[1860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:26:15.470928 systemd-logind[1644]: New session 5 of user core. Jan 23 17:26:15.483716 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 23 17:26:15.756480 sshd[1881]: Connection closed by 4.153.228.146 port 57958 Jan 23 17:26:15.756833 sshd-session[1860]: pam_unix(sshd:session): session closed for user core Jan 23 17:26:15.760989 systemd[1]: sshd@3-10.0.1.37:22-4.153.228.146:57958.service: Deactivated successfully. Jan 23 17:26:15.762676 systemd[1]: session-5.scope: Deactivated successfully. Jan 23 17:26:15.765019 systemd-logind[1644]: Session 5 logged out. Waiting for processes to exit. Jan 23 17:26:15.766028 systemd-logind[1644]: Removed session 5. 
Jan 23 17:26:15.866594 systemd[1]: Started sshd@4-10.0.1.37:22-4.153.228.146:57974.service - OpenSSH per-connection server daemon (4.153.228.146:57974). Jan 23 17:26:16.380087 sshd[1887]: Accepted publickey for core from 4.153.228.146 port 57974 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:26:16.381466 sshd-session[1887]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:26:16.385385 systemd-logind[1644]: New session 6 of user core. Jan 23 17:26:16.398663 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 23 17:26:16.432080 chronyd[1630]: Selected source PHC0 Jan 23 17:26:16.599435 sudo[1892]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 23 17:26:16.599724 sudo[1892]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 17:26:16.611718 sudo[1892]: pam_unix(sudo:session): session closed for user root Jan 23 17:26:16.715605 sshd[1891]: Connection closed by 4.153.228.146 port 57974 Jan 23 17:26:16.716197 sshd-session[1887]: pam_unix(sshd:session): session closed for user core Jan 23 17:26:16.721905 systemd[1]: sshd@4-10.0.1.37:22-4.153.228.146:57974.service: Deactivated successfully. Jan 23 17:26:16.724373 systemd[1]: session-6.scope: Deactivated successfully. Jan 23 17:26:16.725830 systemd-logind[1644]: Session 6 logged out. Waiting for processes to exit. Jan 23 17:26:16.727507 systemd-logind[1644]: Removed session 6. Jan 23 17:26:16.842493 systemd[1]: Started sshd@5-10.0.1.37:22-4.153.228.146:57976.service - OpenSSH per-connection server daemon (4.153.228.146:57976). Jan 23 17:26:17.413993 sshd[1899]: Accepted publickey for core from 4.153.228.146 port 57976 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:26:17.415446 sshd-session[1899]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:26:17.419697 systemd-logind[1644]: New session 7 of user core. 
Jan 23 17:26:17.431722 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 23 17:26:17.626897 sudo[1905]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 23 17:26:17.627169 sudo[1905]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 17:26:17.629996 sudo[1905]: pam_unix(sudo:session): session closed for user root Jan 23 17:26:17.636140 sudo[1904]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 23 17:26:17.636446 sudo[1904]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 17:26:17.644309 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 23 17:26:17.687000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 23 17:26:17.689984 kernel: kauditd_printk_skb: 186 callbacks suppressed Jan 23 17:26:17.690045 kernel: audit: type=1305 audit(1769189177.687:230): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 23 17:26:17.690949 augenrules[1929]: No rules Jan 23 17:26:17.687000 audit[1929]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffff677fb0 a2=420 a3=0 items=0 ppid=1910 pid=1929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:17.692345 systemd[1]: audit-rules.service: Deactivated successfully. Jan 23 17:26:17.692908 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Jan 23 17:26:17.694349 sudo[1904]: pam_unix(sudo:session): session closed for user root Jan 23 17:26:17.695365 kernel: audit: type=1300 audit(1769189177.687:230): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffff677fb0 a2=420 a3=0 items=0 ppid=1910 pid=1929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:17.695423 kernel: audit: type=1327 audit(1769189177.687:230): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 23 17:26:17.687000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 23 17:26:17.692000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:17.699005 kernel: audit: type=1130 audit(1769189177.692:231): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:17.694000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:17.701363 kernel: audit: type=1131 audit(1769189177.694:232): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:26:17.701410 kernel: audit: type=1106 audit(1769189177.694:233): pid=1904 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 17:26:17.694000 audit[1904]: USER_END pid=1904 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 17:26:17.703789 kernel: audit: type=1104 audit(1769189177.694:234): pid=1904 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 17:26:17.694000 audit[1904]: CRED_DISP pid=1904 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 17:26:17.798760 sshd[1903]: Connection closed by 4.153.228.146 port 57976 Jan 23 17:26:17.799573 sshd-session[1899]: pam_unix(sshd:session): session closed for user core Jan 23 17:26:17.800000 audit[1899]: USER_END pid=1899 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:26:17.800000 audit[1899]: CRED_DISP pid=1899 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:26:17.804602 systemd-logind[1644]: Session 7 logged out. Waiting for processes to exit. 
Jan 23 17:26:17.805257 systemd[1]: sshd@5-10.0.1.37:22-4.153.228.146:57976.service: Deactivated successfully. Jan 23 17:26:17.808130 kernel: audit: type=1106 audit(1769189177.800:235): pid=1899 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:26:17.808209 kernel: audit: type=1104 audit(1769189177.800:236): pid=1899 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:26:17.808238 kernel: audit: type=1131 audit(1769189177.806:237): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.0.1.37:22-4.153.228.146:57976 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:17.806000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.0.1.37:22-4.153.228.146:57976 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:17.809509 systemd[1]: session-7.scope: Deactivated successfully. Jan 23 17:26:17.811923 systemd-logind[1644]: Removed session 7. Jan 23 17:26:17.915563 systemd[1]: Started sshd@6-10.0.1.37:22-4.153.228.146:57988.service - OpenSSH per-connection server daemon (4.153.228.146:57988). Jan 23 17:26:17.915000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.1.37:22-4.153.228.146:57988 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:26:18.490000 audit[1938]: USER_ACCT pid=1938 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:26:18.491591 sshd[1938]: Accepted publickey for core from 4.153.228.146 port 57988 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:26:18.491000 audit[1938]: CRED_ACQ pid=1938 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:26:18.491000 audit[1938]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdff150f0 a2=3 a3=0 items=0 ppid=1 pid=1938 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:18.491000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:26:18.492928 sshd-session[1938]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:26:18.497246 systemd-logind[1644]: New session 8 of user core. Jan 23 17:26:18.508626 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jan 23 17:26:18.510000 audit[1938]: USER_START pid=1938 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:26:18.512000 audit[1942]: CRED_ACQ pid=1942 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:26:18.702000 audit[1943]: USER_ACCT pid=1943 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 17:26:18.704002 sudo[1943]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 23 17:26:18.702000 audit[1943]: CRED_REFR pid=1943 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 17:26:18.702000 audit[1943]: USER_START pid=1943 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 17:26:18.704284 sudo[1943]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 23 17:26:19.131674 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Jan 23 17:26:19.145718 (dockerd)[1964]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 23 17:26:19.469918 dockerd[1964]: time="2026-01-23T17:26:19.469853868Z" level=info msg="Starting up" Jan 23 17:26:19.471382 dockerd[1964]: time="2026-01-23T17:26:19.471337435Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 23 17:26:19.482742 dockerd[1964]: time="2026-01-23T17:26:19.482699144Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 23 17:26:19.527342 systemd[1]: var-lib-docker-metacopy\x2dcheck3396484391-merged.mount: Deactivated successfully. Jan 23 17:26:19.542131 dockerd[1964]: time="2026-01-23T17:26:19.542091669Z" level=info msg="Loading containers: start." Jan 23 17:26:19.556402 kernel: Initializing XFRM netlink socket Jan 23 17:26:19.610000 audit[2015]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2015 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.610000 audit[2015]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffc680fb10 a2=0 a3=0 items=0 ppid=1964 pid=2015 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.610000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 23 17:26:19.612000 audit[2017]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2017 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.612000 audit[2017]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffc90bf7b0 a2=0 a3=0 items=0 ppid=1964 pid=2017 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.612000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 23 17:26:19.614000 audit[2019]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2019 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.614000 audit[2019]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc93d24f0 a2=0 a3=0 items=0 ppid=1964 pid=2019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.614000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 23 17:26:19.617000 audit[2021]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2021 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.617000 audit[2021]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffecb8f360 a2=0 a3=0 items=0 ppid=1964 pid=2021 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.617000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 23 17:26:19.619000 audit[2023]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2023 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.619000 audit[2023]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffeecfa950 a2=0 a3=0 items=0 ppid=1964 pid=2023 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.619000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 23 17:26:19.621000 audit[2025]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2025 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.621000 audit[2025]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffe1f012c0 a2=0 a3=0 items=0 ppid=1964 pid=2025 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.621000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 17:26:19.622000 audit[2027]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2027 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.622000 audit[2027]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffe2f16110 a2=0 a3=0 items=0 ppid=1964 pid=2027 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.622000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 23 17:26:19.624000 audit[2029]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2029 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.624000 audit[2029]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffe8ca2a80 a2=0 a3=0 items=0 ppid=1964 pid=2029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.624000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 23 17:26:19.656000 audit[2032]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2032 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.656000 audit[2032]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=fffff9c8d770 a2=0 a3=0 items=0 ppid=1964 pid=2032 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.656000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 23 17:26:19.658000 audit[2034]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2034 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.658000 audit[2034]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffdc825870 a2=0 a3=0 items=0 ppid=1964 pid=2034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.658000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 23 17:26:19.660000 audit[2036]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2036 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.660000 audit[2036]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=236 a0=3 a1=ffffd8b81900 a2=0 a3=0 items=0 ppid=1964 pid=2036 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.660000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 23 17:26:19.661000 audit[2038]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2038 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.661000 audit[2038]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffed8ec1c0 a2=0 a3=0 items=0 ppid=1964 pid=2038 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.661000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 17:26:19.663000 audit[2040]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2040 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.663000 audit[2040]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffd60390f0 a2=0 a3=0 items=0 ppid=1964 pid=2040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.663000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 23 17:26:19.701000 audit[2070]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2070 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.701000 
audit[2070]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=fffff400ff70 a2=0 a3=0 items=0 ppid=1964 pid=2070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.701000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 23 17:26:19.703000 audit[2072]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2072 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.703000 audit[2072]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=fffff76cbd30 a2=0 a3=0 items=0 ppid=1964 pid=2072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.703000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 23 17:26:19.705000 audit[2074]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2074 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.705000 audit[2074]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc6104610 a2=0 a3=0 items=0 ppid=1964 pid=2074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.705000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 23 17:26:19.708000 audit[2076]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2076 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.708000 audit[2076]: SYSCALL 
arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd99d6160 a2=0 a3=0 items=0 ppid=1964 pid=2076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.708000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 23 17:26:19.710000 audit[2078]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2078 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.710000 audit[2078]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc1bae250 a2=0 a3=0 items=0 ppid=1964 pid=2078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.710000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 23 17:26:19.712000 audit[2080]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2080 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.712000 audit[2080]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffe331dff0 a2=0 a3=0 items=0 ppid=1964 pid=2080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.712000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 17:26:19.712000 audit[2082]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2082 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.712000 
audit[2082]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffe02d0c70 a2=0 a3=0 items=0 ppid=1964 pid=2082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.712000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 23 17:26:19.715000 audit[2084]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2084 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.715000 audit[2084]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffc9cbd360 a2=0 a3=0 items=0 ppid=1964 pid=2084 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.715000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 23 17:26:19.717000 audit[2086]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2086 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.717000 audit[2086]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffe0bee750 a2=0 a3=0 items=0 ppid=1964 pid=2086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.717000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 23 
17:26:19.719000 audit[2088]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2088 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.719000 audit[2088]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffffa3b1900 a2=0 a3=0 items=0 ppid=1964 pid=2088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.719000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 23 17:26:19.722000 audit[2090]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2090 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.722000 audit[2090]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffd5599350 a2=0 a3=0 items=0 ppid=1964 pid=2090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.722000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 23 17:26:19.725000 audit[2092]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2092 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.725000 audit[2092]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffda749560 a2=0 a3=0 items=0 ppid=1964 pid=2092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.725000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 23 17:26:19.726000 audit[2094]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2094 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.726000 audit[2094]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffdd8bd0a0 a2=0 a3=0 items=0 ppid=1964 pid=2094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.726000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 23 17:26:19.732000 audit[2099]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2099 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.732000 audit[2099]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffffd7054e0 a2=0 a3=0 items=0 ppid=1964 pid=2099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.732000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 23 17:26:19.736000 audit[2101]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2101 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.736000 audit[2101]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=fffff6f90480 a2=0 a3=0 items=0 ppid=1964 pid=2101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
17:26:19.736000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 23 17:26:19.738000 audit[2103]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2103 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.738000 audit[2103]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffd3096d00 a2=0 a3=0 items=0 ppid=1964 pid=2103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.738000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 23 17:26:19.739000 audit[2105]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2105 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.739000 audit[2105]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc9909210 a2=0 a3=0 items=0 ppid=1964 pid=2105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.739000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 23 17:26:19.741000 audit[2107]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2107 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.741000 audit[2107]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffe66d7140 a2=0 a3=0 items=0 ppid=1964 pid=2107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.741000 
audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 23 17:26:19.743000 audit[2109]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2109 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:19.743000 audit[2109]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=fffff9aa2330 a2=0 a3=0 items=0 ppid=1964 pid=2109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.743000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 23 17:26:19.761000 audit[2114]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2114 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.761000 audit[2114]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=ffffce37dd30 a2=0 a3=0 items=0 ppid=1964 pid=2114 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.761000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 23 17:26:19.763000 audit[2116]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2116 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.763000 audit[2116]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffdeb70960 a2=0 a3=0 items=0 ppid=1964 pid=2116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.763000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 23 17:26:19.771000 audit[2124]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2124 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.771000 audit[2124]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=ffffc8549750 a2=0 a3=0 items=0 ppid=1964 pid=2124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.771000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 23 17:26:19.786000 audit[2130]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2130 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.786000 audit[2130]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffd86a8610 a2=0 a3=0 items=0 ppid=1964 pid=2130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.786000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 23 17:26:19.788000 audit[2132]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2132 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.788000 audit[2132]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=ffffef9ce050 a2=0 a3=0 items=0 ppid=1964 pid=2132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.788000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 23 17:26:19.790000 audit[2134]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2134 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.790000 audit[2134]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffe3d4c510 a2=0 a3=0 items=0 ppid=1964 pid=2134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.790000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 23 17:26:19.791000 audit[2136]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2136 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.791000 audit[2136]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=ffffed9517a0 a2=0 a3=0 items=0 ppid=1964 pid=2136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.791000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 23 17:26:19.793000 audit[2138]: NETFILTER_CFG table=filter:41 family=2 entries=1 
op=nft_register_rule pid=2138 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:19.793000 audit[2138]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffea32bf80 a2=0 a3=0 items=0 ppid=1964 pid=2138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:19.793000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 23 17:26:19.795740 systemd-networkd[1585]: docker0: Link UP Jan 23 17:26:19.799883 dockerd[1964]: time="2026-01-23T17:26:19.799835093Z" level=info msg="Loading containers: done." Jan 23 17:26:19.812686 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3262641105-merged.mount: Deactivated successfully. Jan 23 17:26:19.825781 dockerd[1964]: time="2026-01-23T17:26:19.825453452Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 23 17:26:19.825781 dockerd[1964]: time="2026-01-23T17:26:19.825544236Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 23 17:26:19.825781 dockerd[1964]: time="2026-01-23T17:26:19.825729270Z" level=info msg="Initializing buildkit" Jan 23 17:26:19.850063 dockerd[1964]: time="2026-01-23T17:26:19.850016813Z" level=info msg="Completed buildkit initialization" Jan 23 17:26:19.855330 dockerd[1964]: time="2026-01-23T17:26:19.855254625Z" level=info msg="Daemon has completed initialization" Jan 23 17:26:19.855616 dockerd[1964]: time="2026-01-23T17:26:19.855494520Z" level=info msg="API listen on /run/docker.sock" Jan 23 17:26:19.855676 systemd[1]: Started docker.service - Docker 
Application Container Engine. Jan 23 17:26:19.854000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:20.957097 containerd[1675]: time="2026-01-23T17:26:20.957010008Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Jan 23 17:26:21.607417 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1796763012.mount: Deactivated successfully. Jan 23 17:26:22.201264 containerd[1675]: time="2026-01-23T17:26:22.201173684Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:22.202350 containerd[1675]: time="2026-01-23T17:26:22.202289450Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=22974850" Jan 23 17:26:22.203357 containerd[1675]: time="2026-01-23T17:26:22.203328815Z" level=info msg="ImageCreate event name:\"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:22.206265 containerd[1675]: time="2026-01-23T17:26:22.206232069Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:22.207155 containerd[1675]: time="2026-01-23T17:26:22.207126313Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"24567639\" in 1.250073507s" Jan 23 17:26:22.207199 containerd[1675]: 
time="2026-01-23T17:26:22.207161954Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\"" Jan 23 17:26:22.207704 containerd[1675]: time="2026-01-23T17:26:22.207625996Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Jan 23 17:26:23.223747 containerd[1675]: time="2026-01-23T17:26:23.223689700Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:23.226119 containerd[1675]: time="2026-01-23T17:26:23.225860111Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=19127323" Jan 23 17:26:23.227836 containerd[1675]: time="2026-01-23T17:26:23.227798400Z" level=info msg="ImageCreate event name:\"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:23.230190 containerd[1675]: time="2026-01-23T17:26:23.230138372Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:23.232031 containerd[1675]: time="2026-01-23T17:26:23.231995941Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"20719958\" in 1.024339985s" Jan 23 17:26:23.232382 containerd[1675]: time="2026-01-23T17:26:23.232150862Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image 
reference \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\"" Jan 23 17:26:23.232597 containerd[1675]: time="2026-01-23T17:26:23.232552224Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Jan 23 17:26:24.095178 containerd[1675]: time="2026-01-23T17:26:24.095121895Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:24.097066 containerd[1675]: time="2026-01-23T17:26:24.097023384Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=0" Jan 23 17:26:24.098179 containerd[1675]: time="2026-01-23T17:26:24.098144030Z" level=info msg="ImageCreate event name:\"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:24.101513 containerd[1675]: time="2026-01-23T17:26:24.101452366Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:24.103044 containerd[1675]: time="2026-01-23T17:26:24.103018454Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"15776215\" in 870.42019ms" Jan 23 17:26:24.103295 containerd[1675]: time="2026-01-23T17:26:24.103250535Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\"" Jan 23 17:26:24.103858 containerd[1675]: time="2026-01-23T17:26:24.103799938Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.34.3\"" Jan 23 17:26:24.953942 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1068980446.mount: Deactivated successfully. Jan 23 17:26:25.114580 containerd[1675]: time="2026-01-23T17:26:25.114538329Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:25.115851 containerd[1675]: time="2026-01-23T17:26:25.115805815Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=0" Jan 23 17:26:25.117335 containerd[1675]: time="2026-01-23T17:26:25.116833979Z" level=info msg="ImageCreate event name:\"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:25.119399 containerd[1675]: time="2026-01-23T17:26:25.119372950Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:25.120298 containerd[1675]: time="2026-01-23T17:26:25.120273873Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"22804272\" in 1.016445695s" Jan 23 17:26:25.120432 containerd[1675]: time="2026-01-23T17:26:25.120415314Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\"" Jan 23 17:26:25.120974 containerd[1675]: time="2026-01-23T17:26:25.120945996Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Jan 23 17:26:25.393830 systemd[1]: kubelet.service: Scheduled 
restart job, restart counter is at 3. Jan 23 17:26:25.395186 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 17:26:25.539335 kernel: kauditd_printk_skb: 132 callbacks suppressed Jan 23 17:26:25.539422 kernel: audit: type=1130 audit(1769189185.536:288): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:25.536000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:25.537900 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 17:26:25.541793 (kubelet)[2262]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 23 17:26:25.577176 kubelet[2262]: E0123 17:26:25.577083 2262 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 23 17:26:25.579183 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 23 17:26:25.579298 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 23 17:26:25.578000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 17:26:25.579701 systemd[1]: kubelet.service: Consumed 137ms CPU time, 107.2M memory peak. 
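The audit records earlier in this log carry the executed command line in the PROCTITLE field as a hex string, with NUL bytes separating the original argv elements. A short decoding sketch (the hex value below is copied from one of the NETFILTER_CFG records above; the helper name is illustrative):

```python
# Decode an audit PROCTITLE value: the process title is hex-encoded,
# with NUL bytes separating the original argv elements.
def decode_proctitle(hexstr: str) -> list[str]:
    raw = bytes.fromhex(hexstr)
    return [part.decode("utf-8", errors="replace") for part in raw.split(b"\x00")]

# Value taken verbatim from one of the audit: PROCTITLE records above.
argv = decode_proctitle(
    "2F7573722F62696E2F69707461626C6573002D2D77616974"
    "002D740066696C746572002D4E00444F434B45522D55534552"
)
print(argv)  # → ['/usr/bin/iptables', '--wait', '-t', 'filter', '-N', 'DOCKER-USER']
```

Decoded this way, the burst of NETFILTER_CFG/PROCTITLE records above is dockerd creating its standard chains (DOCKER-USER, DOCKER-FORWARD, DOCKER-ISOLATION-STAGE-1/2) via xtables-nft-multi.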
Jan 23 17:26:25.582337 kernel: audit: type=1131 audit(1769189185.578:289): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 17:26:25.663878 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2319334115.mount: Deactivated successfully. Jan 23 17:26:26.190140 containerd[1675]: time="2026-01-23T17:26:26.189482222Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:26.191493 containerd[1675]: time="2026-01-23T17:26:26.191438272Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=19576236" Jan 23 17:26:26.193007 containerd[1675]: time="2026-01-23T17:26:26.192952919Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:26.195764 containerd[1675]: time="2026-01-23T17:26:26.195737973Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:26.196696 containerd[1675]: time="2026-01-23T17:26:26.196663138Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.075659742s" Jan 23 17:26:26.196696 containerd[1675]: time="2026-01-23T17:26:26.196695858Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference 
\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\"" Jan 23 17:26:26.197282 containerd[1675]: time="2026-01-23T17:26:26.197217380Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Jan 23 17:26:26.899948 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3314767295.mount: Deactivated successfully. Jan 23 17:26:26.908070 containerd[1675]: time="2026-01-23T17:26:26.907990507Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:26.909285 containerd[1675]: time="2026-01-23T17:26:26.909059552Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=0" Jan 23 17:26:26.910141 containerd[1675]: time="2026-01-23T17:26:26.910097557Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:26.912444 containerd[1675]: time="2026-01-23T17:26:26.912412569Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:26.913136 containerd[1675]: time="2026-01-23T17:26:26.913114612Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 715.826871ms" Jan 23 17:26:26.913194 containerd[1675]: time="2026-01-23T17:26:26.913140132Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\"" Jan 23 17:26:26.913573 containerd[1675]: time="2026-01-23T17:26:26.913552494Z" 
level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Jan 23 17:26:27.567962 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2487468356.mount: Deactivated successfully. Jan 23 17:26:29.325944 containerd[1675]: time="2026-01-23T17:26:29.325880928Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:29.326524 containerd[1675]: time="2026-01-23T17:26:29.326468691Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=85821047" Jan 23 17:26:29.327881 containerd[1675]: time="2026-01-23T17:26:29.327840618Z" level=info msg="ImageCreate event name:\"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:29.330490 containerd[1675]: time="2026-01-23T17:26:29.330455430Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:29.331496 containerd[1675]: time="2026-01-23T17:26:29.331452435Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"98207481\" in 2.417870661s" Jan 23 17:26:29.331496 containerd[1675]: time="2026-01-23T17:26:29.331479315Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\"" Jan 23 17:26:34.999813 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 17:26:34.999966 systemd[1]: kubelet.service: Consumed 137ms CPU time, 107.2M memory peak. 
Jan 23 17:26:34.998000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:35.001884 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 17:26:34.998000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:35.005131 kernel: audit: type=1130 audit(1769189194.998:290): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:35.005191 kernel: audit: type=1131 audit(1769189194.998:291): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:35.028469 systemd[1]: Reload requested from client PID 2411 ('systemctl') (unit session-8.scope)... Jan 23 17:26:35.028607 systemd[1]: Reloading... Jan 23 17:26:35.119440 zram_generator::config[2463]: No configuration found. Jan 23 17:26:35.286220 systemd[1]: Reloading finished in 257 ms. 
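The interleaved kernel audit lines carry two clocks: the journald wall-clock prefix and the `audit(<epoch>.<ms>:<serial>)` field. A quick conversion (epoch value copied from the SERVICE_START record above) shows they agree:

```python
from datetime import datetime, timezone

# audit(1769189194.998:...) — epoch seconds from the audit record above,
# which journald prefixes as "Jan 23 17:26:34.998000".
ts = datetime.fromtimestamp(1769189194.998, tz=timezone.utc)
print(ts.isoformat())
```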
Jan 23 17:26:35.319642 kernel: audit: type=1334 audit(1769189195.316:292): prog-id=63 op=LOAD Jan 23 17:26:35.319724 kernel: audit: type=1334 audit(1769189195.316:293): prog-id=59 op=UNLOAD Jan 23 17:26:35.319744 kernel: audit: type=1334 audit(1769189195.317:294): prog-id=64 op=LOAD Jan 23 17:26:35.319766 kernel: audit: type=1334 audit(1769189195.317:295): prog-id=43 op=UNLOAD Jan 23 17:26:35.316000 audit: BPF prog-id=63 op=LOAD Jan 23 17:26:35.316000 audit: BPF prog-id=59 op=UNLOAD Jan 23 17:26:35.317000 audit: BPF prog-id=64 op=LOAD Jan 23 17:26:35.317000 audit: BPF prog-id=43 op=UNLOAD Jan 23 17:26:35.318000 audit: BPF prog-id=65 op=LOAD Jan 23 17:26:35.320971 kernel: audit: type=1334 audit(1769189195.318:296): prog-id=65 op=LOAD Jan 23 17:26:35.325000 audit: BPF prog-id=66 op=LOAD Jan 23 17:26:35.325000 audit: BPF prog-id=44 op=UNLOAD Jan 23 17:26:35.325000 audit: BPF prog-id=45 op=UNLOAD Jan 23 17:26:35.329292 kernel: audit: type=1334 audit(1769189195.325:297): prog-id=66 op=LOAD Jan 23 17:26:35.329370 kernel: audit: type=1334 audit(1769189195.325:298): prog-id=44 op=UNLOAD Jan 23 17:26:35.329398 kernel: audit: type=1334 audit(1769189195.325:299): prog-id=45 op=UNLOAD Jan 23 17:26:35.326000 audit: BPF prog-id=67 op=LOAD Jan 23 17:26:35.326000 audit: BPF prog-id=46 op=UNLOAD Jan 23 17:26:35.328000 audit: BPF prog-id=68 op=LOAD Jan 23 17:26:35.328000 audit: BPF prog-id=69 op=LOAD Jan 23 17:26:35.328000 audit: BPF prog-id=53 op=UNLOAD Jan 23 17:26:35.328000 audit: BPF prog-id=54 op=UNLOAD Jan 23 17:26:35.329000 audit: BPF prog-id=70 op=LOAD Jan 23 17:26:35.329000 audit: BPF prog-id=58 op=UNLOAD Jan 23 17:26:35.329000 audit: BPF prog-id=71 op=LOAD Jan 23 17:26:35.329000 audit: BPF prog-id=50 op=UNLOAD Jan 23 17:26:35.329000 audit: BPF prog-id=72 op=LOAD Jan 23 17:26:35.329000 audit: BPF prog-id=73 op=LOAD Jan 23 17:26:35.329000 audit: BPF prog-id=51 op=UNLOAD Jan 23 17:26:35.329000 audit: BPF prog-id=52 op=UNLOAD Jan 23 17:26:35.330000 audit: BPF prog-id=74 
op=LOAD Jan 23 17:26:35.330000 audit: BPF prog-id=47 op=UNLOAD Jan 23 17:26:35.330000 audit: BPF prog-id=75 op=LOAD Jan 23 17:26:35.330000 audit: BPF prog-id=76 op=LOAD Jan 23 17:26:35.330000 audit: BPF prog-id=48 op=UNLOAD Jan 23 17:26:35.330000 audit: BPF prog-id=49 op=UNLOAD Jan 23 17:26:35.330000 audit: BPF prog-id=77 op=LOAD Jan 23 17:26:35.330000 audit: BPF prog-id=55 op=UNLOAD Jan 23 17:26:35.330000 audit: BPF prog-id=78 op=LOAD Jan 23 17:26:35.330000 audit: BPF prog-id=79 op=LOAD Jan 23 17:26:35.330000 audit: BPF prog-id=56 op=UNLOAD Jan 23 17:26:35.330000 audit: BPF prog-id=57 op=UNLOAD Jan 23 17:26:35.331000 audit: BPF prog-id=80 op=LOAD Jan 23 17:26:35.331000 audit: BPF prog-id=60 op=UNLOAD Jan 23 17:26:35.331000 audit: BPF prog-id=81 op=LOAD Jan 23 17:26:35.331000 audit: BPF prog-id=82 op=LOAD Jan 23 17:26:35.332000 audit: BPF prog-id=61 op=UNLOAD Jan 23 17:26:35.332000 audit: BPF prog-id=62 op=UNLOAD Jan 23 17:26:35.361356 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 23 17:26:35.361565 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 23 17:26:35.361901 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 17:26:35.360000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 23 17:26:35.361960 systemd[1]: kubelet.service: Consumed 92ms CPU time, 95M memory peak. Jan 23 17:26:35.363429 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 17:26:35.476000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:35.477241 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 23 17:26:35.490754 (kubelet)[2505]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 23 17:26:35.522602 kubelet[2505]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 23 17:26:35.522602 kubelet[2505]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 23 17:26:35.523442 kubelet[2505]: I0123 17:26:35.523400 2505 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 23 17:26:36.792285 kubelet[2505]: I0123 17:26:36.792232 2505 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Jan 23 17:26:36.792285 kubelet[2505]: I0123 17:26:36.792268 2505 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 23 17:26:36.794475 kubelet[2505]: I0123 17:26:36.794448 2505 watchdog_linux.go:95] "Systemd watchdog is not enabled" Jan 23 17:26:36.794475 kubelet[2505]: I0123 17:26:36.794467 2505 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 23 17:26:36.796242 kubelet[2505]: I0123 17:26:36.796186 2505 server.go:956] "Client rotation is on, will bootstrap in background" Jan 23 17:26:36.809012 kubelet[2505]: I0123 17:26:36.808969 2505 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 23 17:26:36.811923 kubelet[2505]: E0123 17:26:36.811894 2505 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.1.37:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.1.37:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jan 23 17:26:36.812645 kubelet[2505]: I0123 17:26:36.812630 2505 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 23 17:26:36.815131 kubelet[2505]: I0123 17:26:36.814961 2505 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Jan 23 17:26:36.815202 kubelet[2505]: I0123 17:26:36.815168 2505 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 23 17:26:36.815363 kubelet[2505]: I0123 17:26:36.815201 2505 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-1-0-a-d0877fd079","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 23 17:26:36.815460 kubelet[2505]: I0123 17:26:36.815365 2505 topology_manager.go:138] "Creating topology manager with none policy" Jan 23 
17:26:36.815460 kubelet[2505]: I0123 17:26:36.815374 2505 container_manager_linux.go:306] "Creating device plugin manager" Jan 23 17:26:36.815501 kubelet[2505]: I0123 17:26:36.815466 2505 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Jan 23 17:26:36.819573 kubelet[2505]: I0123 17:26:36.819455 2505 state_mem.go:36] "Initialized new in-memory state store" Jan 23 17:26:36.821877 kubelet[2505]: I0123 17:26:36.821838 2505 kubelet.go:475] "Attempting to sync node with API server" Jan 23 17:26:36.821877 kubelet[2505]: I0123 17:26:36.821869 2505 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 23 17:26:36.821952 kubelet[2505]: I0123 17:26:36.821894 2505 kubelet.go:387] "Adding apiserver pod source" Jan 23 17:26:36.821952 kubelet[2505]: I0123 17:26:36.821916 2505 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 23 17:26:36.822519 kubelet[2505]: E0123 17:26:36.822455 2505 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.1.37:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-1-0-a-d0877fd079&limit=500&resourceVersion=0\": dial tcp 10.0.1.37:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 23 17:26:36.822615 kubelet[2505]: E0123 17:26:36.822576 2505 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.1.37:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.1.37:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 23 17:26:36.824633 kubelet[2505]: I0123 17:26:36.824616 2505 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 23 17:26:36.825723 kubelet[2505]: I0123 17:26:36.825472 2505 kubelet.go:940] "Not starting 
ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 23 17:26:36.825723 kubelet[2505]: I0123 17:26:36.825506 2505 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Jan 23 17:26:36.825723 kubelet[2505]: W0123 17:26:36.825544 2505 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 23 17:26:36.829884 kubelet[2505]: I0123 17:26:36.829845 2505 server.go:1262] "Started kubelet" Jan 23 17:26:36.830137 kubelet[2505]: I0123 17:26:36.830107 2505 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 23 17:26:36.831633 kubelet[2505]: I0123 17:26:36.831606 2505 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 23 17:26:36.832006 kubelet[2505]: I0123 17:26:36.831983 2505 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 23 17:26:36.832594 kubelet[2505]: I0123 17:26:36.832579 2505 volume_manager.go:313] "Starting Kubelet Volume Manager" Jan 23 17:26:36.832766 kubelet[2505]: E0123 17:26:36.832750 2505 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-1-0-a-d0877fd079\" not found" Jan 23 17:26:36.832964 kubelet[2505]: I0123 17:26:36.832947 2505 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 23 17:26:36.833080 kubelet[2505]: I0123 17:26:36.833069 2505 reconciler.go:29] "Reconciler: start to sync state" Jan 23 17:26:36.833181 kubelet[2505]: I0123 17:26:36.833120 2505 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 23 17:26:36.833216 kubelet[2505]: I0123 17:26:36.833198 2505 server_v1.go:49] "podresources" method="list" useActivePods=true Jan 23 
17:26:36.833476 kubelet[2505]: I0123 17:26:36.833448 2505 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 23 17:26:36.840765 kubelet[2505]: I0123 17:26:36.838908 2505 server.go:310] "Adding debug handlers to kubelet server" Jan 23 17:26:36.841772 kubelet[2505]: E0123 17:26:36.841741 2505 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.1.37:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-1-0-a-d0877fd079?timeout=10s\": dial tcp 10.0.1.37:6443: connect: connection refused" interval="200ms" Jan 23 17:26:36.842168 kubelet[2505]: E0123 17:26:36.842129 2505 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.1.37:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.1.37:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 23 17:26:36.841000 audit[2523]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2523 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:36.841000 audit[2523]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffe7154330 a2=0 a3=0 items=0 ppid=2505 pid=2523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:36.841000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 23 17:26:36.842988 kubelet[2505]: I0123 17:26:36.842835 2505 factory.go:223] Registration of the systemd container factory successfully Jan 23 17:26:36.842988 kubelet[2505]: I0123 17:26:36.842940 2505 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix 
/var/run/crio/crio.sock: connect: no such file or directory Jan 23 17:26:36.842000 audit[2524]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2524 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:36.842000 audit[2524]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc84c7230 a2=0 a3=0 items=0 ppid=2505 pid=2524 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:36.842000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 23 17:26:36.844709 kubelet[2505]: I0123 17:26:36.844689 2505 factory.go:223] Registration of the containerd container factory successfully Jan 23 17:26:36.846044 kubelet[2505]: E0123 17:26:36.843943 2505 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.1.37:6443/api/v1/namespaces/default/events\": dial tcp 10.0.1.37:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-1-0-a-d0877fd079.188d6c33717ff825 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-1-0-a-d0877fd079,UID:ci-4547-1-0-a-d0877fd079,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-1-0-a-d0877fd079,},FirstTimestamp:2026-01-23 17:26:36.829816869 +0000 UTC m=+1.335761325,LastTimestamp:2026-01-23 17:26:36.829816869 +0000 UTC m=+1.335761325,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-1-0-a-d0877fd079,}" Jan 23 17:26:36.844000 audit[2526]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2526 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:36.844000 audit[2526]: 
SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffd806bd80 a2=0 a3=0 items=0 ppid=2505 pid=2526 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:36.844000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 17:26:36.846000 audit[2528]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2528 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:36.846000 audit[2528]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffc08cedc0 a2=0 a3=0 items=0 ppid=2505 pid=2528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:36.846000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 17:26:36.853000 audit[2531]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2531 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:36.853000 audit[2531]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=ffffd172a8b0 a2=0 a3=0 items=0 ppid=2505 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:36.853000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F380000002D2D737263003132372E Jan 23 17:26:36.855340 kubelet[2505]: I0123 17:26:36.855285 2505 
kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Jan 23 17:26:36.855000 audit[2534]: NETFILTER_CFG table=mangle:47 family=2 entries=1 op=nft_register_chain pid=2534 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:36.855000 audit[2534]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff046de60 a2=0 a3=0 items=0 ppid=2505 pid=2534 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:36.855000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 23 17:26:36.855000 audit[2533]: NETFILTER_CFG table=mangle:48 family=10 entries=2 op=nft_register_chain pid=2533 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:36.855000 audit[2533]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffcbea1300 a2=0 a3=0 items=0 ppid=2505 pid=2533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:36.855000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 23 17:26:36.856879 kubelet[2505]: I0123 17:26:36.856716 2505 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Jan 23 17:26:36.856879 kubelet[2505]: I0123 17:26:36.856740 2505 status_manager.go:244] "Starting to sync pod status with apiserver" Jan 23 17:26:36.856879 kubelet[2505]: I0123 17:26:36.856761 2505 kubelet.go:2427] "Starting kubelet main sync loop" Jan 23 17:26:36.856879 kubelet[2505]: E0123 17:26:36.856800 2505 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 23 17:26:36.856000 audit[2535]: NETFILTER_CFG table=nat:49 family=2 entries=1 op=nft_register_chain pid=2535 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:36.856000 audit[2535]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc9f4e570 a2=0 a3=0 items=0 ppid=2505 pid=2535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:36.856000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 23 17:26:36.856000 audit[2536]: NETFILTER_CFG table=mangle:50 family=10 entries=1 op=nft_register_chain pid=2536 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:36.856000 audit[2536]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffffae66a10 a2=0 a3=0 items=0 ppid=2505 pid=2536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:36.856000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 23 17:26:36.857000 audit[2537]: NETFILTER_CFG table=filter:51 family=2 entries=1 op=nft_register_chain pid=2537 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:36.857000 audit[2537]: 
SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffffacb7000 a2=0 a3=0 items=0 ppid=2505 pid=2537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:36.857000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 23 17:26:36.857000 audit[2538]: NETFILTER_CFG table=nat:52 family=10 entries=1 op=nft_register_chain pid=2538 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:36.857000 audit[2538]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff3751fc0 a2=0 a3=0 items=0 ppid=2505 pid=2538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:36.857000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 23 17:26:36.859548 kubelet[2505]: E0123 17:26:36.859505 2505 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.1.37:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.1.37:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 23 17:26:36.858000 audit[2539]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2539 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:36.858000 audit[2539]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe8276720 a2=0 a3=0 items=0 ppid=2505 pid=2539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 23 17:26:36.858000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 23 17:26:36.859984 kubelet[2505]: I0123 17:26:36.859970 2505 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 23 17:26:36.859984 kubelet[2505]: I0123 17:26:36.859981 2505 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 23 17:26:36.860044 kubelet[2505]: I0123 17:26:36.860001 2505 state_mem.go:36] "Initialized new in-memory state store" Jan 23 17:26:36.863227 kubelet[2505]: I0123 17:26:36.863169 2505 policy_none.go:49] "None policy: Start" Jan 23 17:26:36.863227 kubelet[2505]: I0123 17:26:36.863205 2505 memory_manager.go:187] "Starting memorymanager" policy="None" Jan 23 17:26:36.863227 kubelet[2505]: I0123 17:26:36.863218 2505 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Jan 23 17:26:36.864692 kubelet[2505]: I0123 17:26:36.864651 2505 policy_none.go:47] "Start" Jan 23 17:26:36.868724 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 23 17:26:36.881822 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 23 17:26:36.885154 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jan 23 17:26:36.903597 kubelet[2505]: E0123 17:26:36.903558 2505 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 23 17:26:36.903769 kubelet[2505]: I0123 17:26:36.903751 2505 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 23 17:26:36.903797 kubelet[2505]: I0123 17:26:36.903768 2505 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 23 17:26:36.904430 kubelet[2505]: I0123 17:26:36.904404 2505 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 23 17:26:36.906199 kubelet[2505]: E0123 17:26:36.906178 2505 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 23 17:26:36.906275 kubelet[2505]: E0123 17:26:36.906220 2505 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4547-1-0-a-d0877fd079\" not found" Jan 23 17:26:36.966964 systemd[1]: Created slice kubepods-burstable-poda17e0b2742d3fb877d04c3a1ae9059a9.slice - libcontainer container kubepods-burstable-poda17e0b2742d3fb877d04c3a1ae9059a9.slice. Jan 23 17:26:36.981140 kubelet[2505]: E0123 17:26:36.981099 2505 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-a-d0877fd079\" not found" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:36.985404 systemd[1]: Created slice kubepods-burstable-podd8841c07b85a974cc6ea38d93a11ff09.slice - libcontainer container kubepods-burstable-podd8841c07b85a974cc6ea38d93a11ff09.slice. 
Jan 23 17:26:37.007333 kubelet[2505]: I0123 17:26:37.007093 2505 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.007493 kubelet[2505]: E0123 17:26:37.007463 2505 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-a-d0877fd079\" not found" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.007725 kubelet[2505]: E0123 17:26:37.007670 2505 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.1.37:6443/api/v1/nodes\": dial tcp 10.0.1.37:6443: connect: connection refused" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.008884 systemd[1]: Created slice kubepods-burstable-pod98539f2e02f5603d0819359de26a1522.slice - libcontainer container kubepods-burstable-pod98539f2e02f5603d0819359de26a1522.slice. Jan 23 17:26:37.010438 kubelet[2505]: E0123 17:26:37.010416 2505 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-a-d0877fd079\" not found" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.034813 kubelet[2505]: I0123 17:26:37.034762 2505 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d8841c07b85a974cc6ea38d93a11ff09-k8s-certs\") pod \"kube-controller-manager-ci-4547-1-0-a-d0877fd079\" (UID: \"d8841c07b85a974cc6ea38d93a11ff09\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.034813 kubelet[2505]: I0123 17:26:37.034810 2505 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d8841c07b85a974cc6ea38d93a11ff09-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-1-0-a-d0877fd079\" (UID: \"d8841c07b85a974cc6ea38d93a11ff09\") " 
pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.034938 kubelet[2505]: I0123 17:26:37.034832 2505 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/98539f2e02f5603d0819359de26a1522-kubeconfig\") pod \"kube-scheduler-ci-4547-1-0-a-d0877fd079\" (UID: \"98539f2e02f5603d0819359de26a1522\") " pod="kube-system/kube-scheduler-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.034938 kubelet[2505]: I0123 17:26:37.034848 2505 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a17e0b2742d3fb877d04c3a1ae9059a9-ca-certs\") pod \"kube-apiserver-ci-4547-1-0-a-d0877fd079\" (UID: \"a17e0b2742d3fb877d04c3a1ae9059a9\") " pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.034938 kubelet[2505]: I0123 17:26:37.034862 2505 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a17e0b2742d3fb877d04c3a1ae9059a9-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-1-0-a-d0877fd079\" (UID: \"a17e0b2742d3fb877d04c3a1ae9059a9\") " pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.034938 kubelet[2505]: I0123 17:26:37.034887 2505 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d8841c07b85a974cc6ea38d93a11ff09-kubeconfig\") pod \"kube-controller-manager-ci-4547-1-0-a-d0877fd079\" (UID: \"d8841c07b85a974cc6ea38d93a11ff09\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.034938 kubelet[2505]: I0123 17:26:37.034918 2505 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/a17e0b2742d3fb877d04c3a1ae9059a9-k8s-certs\") pod \"kube-apiserver-ci-4547-1-0-a-d0877fd079\" (UID: \"a17e0b2742d3fb877d04c3a1ae9059a9\") " pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.035033 kubelet[2505]: I0123 17:26:37.034947 2505 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d8841c07b85a974cc6ea38d93a11ff09-ca-certs\") pod \"kube-controller-manager-ci-4547-1-0-a-d0877fd079\" (UID: \"d8841c07b85a974cc6ea38d93a11ff09\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.035033 kubelet[2505]: I0123 17:26:37.034974 2505 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/d8841c07b85a974cc6ea38d93a11ff09-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-1-0-a-d0877fd079\" (UID: \"d8841c07b85a974cc6ea38d93a11ff09\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.043445 kubelet[2505]: E0123 17:26:37.043218 2505 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.1.37:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-1-0-a-d0877fd079?timeout=10s\": dial tcp 10.0.1.37:6443: connect: connection refused" interval="400ms" Jan 23 17:26:37.209754 kubelet[2505]: I0123 17:26:37.209719 2505 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.210098 kubelet[2505]: E0123 17:26:37.210054 2505 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.1.37:6443/api/v1/nodes\": dial tcp 10.0.1.37:6443: connect: connection refused" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.285178 containerd[1675]: time="2026-01-23T17:26:37.285127422Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4547-1-0-a-d0877fd079,Uid:a17e0b2742d3fb877d04c3a1ae9059a9,Namespace:kube-system,Attempt:0,}" Jan 23 17:26:37.311950 containerd[1675]: time="2026-01-23T17:26:37.311713312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-1-0-a-d0877fd079,Uid:d8841c07b85a974cc6ea38d93a11ff09,Namespace:kube-system,Attempt:0,}" Jan 23 17:26:37.313657 containerd[1675]: time="2026-01-23T17:26:37.313618922Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-1-0-a-d0877fd079,Uid:98539f2e02f5603d0819359de26a1522,Namespace:kube-system,Attempt:0,}" Jan 23 17:26:37.444262 kubelet[2505]: E0123 17:26:37.444181 2505 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.1.37:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-1-0-a-d0877fd079?timeout=10s\": dial tcp 10.0.1.37:6443: connect: connection refused" interval="800ms" Jan 23 17:26:37.612420 kubelet[2505]: I0123 17:26:37.612279 2505 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.612792 kubelet[2505]: E0123 17:26:37.612768 2505 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.1.37:6443/api/v1/nodes\": dial tcp 10.0.1.37:6443: connect: connection refused" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:37.694350 update_engine[1647]: I20260123 17:26:37.694211 1647 update_attempter.cc:509] Updating boot flags... Jan 23 17:26:37.860106 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount868601242.mount: Deactivated successfully. 
Jan 23 17:26:37.866926 containerd[1675]: time="2026-01-23T17:26:37.866774795Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 17:26:37.870103 containerd[1675]: time="2026-01-23T17:26:37.870009771Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 23 17:26:37.873411 containerd[1675]: time="2026-01-23T17:26:37.873336148Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 17:26:37.874463 containerd[1675]: time="2026-01-23T17:26:37.874418433Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 17:26:37.875402 containerd[1675]: time="2026-01-23T17:26:37.875363397Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 23 17:26:37.876458 containerd[1675]: time="2026-01-23T17:26:37.876411483Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 17:26:37.878182 containerd[1675]: time="2026-01-23T17:26:37.878127891Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 23 17:26:37.880765 containerd[1675]: time="2026-01-23T17:26:37.880696064Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 23 17:26:37.881556 
containerd[1675]: time="2026-01-23T17:26:37.881511628Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 567.085182ms" Jan 23 17:26:37.883222 containerd[1675]: time="2026-01-23T17:26:37.883173796Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 594.121115ms" Jan 23 17:26:37.884891 containerd[1675]: time="2026-01-23T17:26:37.884827124Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 566.217458ms" Jan 23 17:26:37.911737 containerd[1675]: time="2026-01-23T17:26:37.911625495Z" level=info msg="connecting to shim 5cf44d65875e6a2134c83dcee7272447686dec73bdbbb593ee8361d63cc1f971" address="unix:///run/containerd/s/f8d16c9df8d4a7eae414ae67a60d4ab77c3fe6a5ce80e3cef06d3b7d0f8efb33" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:26:37.917940 containerd[1675]: time="2026-01-23T17:26:37.917886886Z" level=info msg="connecting to shim 7e26985d6d93c8207e4710391631ebf09be9da0ac66b5bb1edc81400ecadbae6" address="unix:///run/containerd/s/75146ef1517a783455e5098d3a21642e585a38568855b05f2ee462b15a145d02" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:26:37.918498 containerd[1675]: time="2026-01-23T17:26:37.918460609Z" level=info msg="connecting to shim 
e6ceddbba3d4f7bb934acfa130d9ff80bd6018ec018211d2b939235f485941e1" address="unix:///run/containerd/s/7a39d99ed47d0dcf11bb1ef8a8b5694747785b02409b9fcb2059835b61215373" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:26:37.920201 kubelet[2505]: E0123 17:26:37.920099 2505 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.1.37:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.1.37:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 23 17:26:37.936512 systemd[1]: Started cri-containerd-5cf44d65875e6a2134c83dcee7272447686dec73bdbbb593ee8361d63cc1f971.scope - libcontainer container 5cf44d65875e6a2134c83dcee7272447686dec73bdbbb593ee8361d63cc1f971. Jan 23 17:26:37.940757 systemd[1]: Started cri-containerd-7e26985d6d93c8207e4710391631ebf09be9da0ac66b5bb1edc81400ecadbae6.scope - libcontainer container 7e26985d6d93c8207e4710391631ebf09be9da0ac66b5bb1edc81400ecadbae6. Jan 23 17:26:37.941970 systemd[1]: Started cri-containerd-e6ceddbba3d4f7bb934acfa130d9ff80bd6018ec018211d2b939235f485941e1.scope - libcontainer container e6ceddbba3d4f7bb934acfa130d9ff80bd6018ec018211d2b939235f485941e1. 
Jan 23 17:26:37.950000 audit: BPF prog-id=83 op=LOAD Jan 23 17:26:37.950000 audit: BPF prog-id=84 op=LOAD Jan 23 17:26:37.950000 audit[2600]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2571 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.950000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563663434643635383735653661323133346338336463656537323732 Jan 23 17:26:37.950000 audit: BPF prog-id=84 op=UNLOAD Jan 23 17:26:37.950000 audit[2600]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2571 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.950000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563663434643635383735653661323133346338336463656537323732 Jan 23 17:26:37.951000 audit: BPF prog-id=85 op=LOAD Jan 23 17:26:37.951000 audit[2600]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2571 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.951000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563663434643635383735653661323133346338336463656537323732 Jan 23 17:26:37.951000 audit: BPF prog-id=86 op=LOAD Jan 23 17:26:37.951000 audit[2600]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2571 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.951000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563663434643635383735653661323133346338336463656537323732 Jan 23 17:26:37.951000 audit: BPF prog-id=86 op=UNLOAD Jan 23 17:26:37.951000 audit[2600]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2571 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.951000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563663434643635383735653661323133346338336463656537323732 Jan 23 17:26:37.952000 audit: BPF prog-id=85 op=UNLOAD Jan 23 17:26:37.952000 audit[2600]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2571 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
17:26:37.952000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563663434643635383735653661323133346338336463656537323732 Jan 23 17:26:37.952000 audit: BPF prog-id=87 op=LOAD Jan 23 17:26:37.952000 audit[2600]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2571 pid=2600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.952000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563663434643635383735653661323133346338336463656537323732 Jan 23 17:26:37.953000 audit: BPF prog-id=88 op=LOAD Jan 23 17:26:37.953000 audit: BPF prog-id=89 op=LOAD Jan 23 17:26:37.953000 audit[2627]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2597 pid=2627 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.953000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6536636564646262613364346637626239333461636661313330643966 Jan 23 17:26:37.953000 audit: BPF prog-id=89 op=UNLOAD Jan 23 17:26:37.953000 audit[2627]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2597 pid=2627 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.953000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6536636564646262613364346637626239333461636661313330643966 Jan 23 17:26:37.954000 audit: BPF prog-id=90 op=LOAD Jan 23 17:26:37.954000 audit[2627]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2597 pid=2627 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.954000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6536636564646262613364346637626239333461636661313330643966 Jan 23 17:26:37.954000 audit: BPF prog-id=91 op=LOAD Jan 23 17:26:37.954000 audit[2627]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2597 pid=2627 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.954000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6536636564646262613364346637626239333461636661313330643966 Jan 23 17:26:37.954000 audit: BPF prog-id=91 op=UNLOAD Jan 23 17:26:37.954000 audit[2627]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2597 pid=2627 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.954000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6536636564646262613364346637626239333461636661313330643966 Jan 23 17:26:37.954000 audit: BPF prog-id=90 op=UNLOAD Jan 23 17:26:37.954000 audit[2627]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2597 pid=2627 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.954000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6536636564646262613364346637626239333461636661313330643966 Jan 23 17:26:37.954000 audit: BPF prog-id=92 op=LOAD Jan 23 17:26:37.954000 audit[2627]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2597 pid=2627 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.954000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6536636564646262613364346637626239333461636661313330643966 Jan 23 17:26:37.955000 audit: BPF prog-id=93 op=LOAD Jan 23 17:26:37.956000 audit: BPF prog-id=94 op=LOAD Jan 23 17:26:37.956000 audit[2625]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 
a1=40001a8180 a2=98 a3=0 items=0 ppid=2595 pid=2625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.956000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765323639383564366439336338323037653437313033393136333165 Jan 23 17:26:37.956000 audit: BPF prog-id=94 op=UNLOAD Jan 23 17:26:37.956000 audit[2625]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2595 pid=2625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.956000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765323639383564366439336338323037653437313033393136333165 Jan 23 17:26:37.956000 audit: BPF prog-id=95 op=LOAD Jan 23 17:26:37.956000 audit[2625]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=2595 pid=2625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.956000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765323639383564366439336338323037653437313033393136333165 Jan 23 17:26:37.957000 audit: BPF prog-id=96 op=LOAD Jan 23 17:26:37.957000 audit[2625]: SYSCALL 
arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=2595 pid=2625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.957000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765323639383564366439336338323037653437313033393136333165 Jan 23 17:26:37.957000 audit: BPF prog-id=96 op=UNLOAD Jan 23 17:26:37.957000 audit[2625]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2595 pid=2625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.957000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765323639383564366439336338323037653437313033393136333165 Jan 23 17:26:37.957000 audit: BPF prog-id=95 op=UNLOAD Jan 23 17:26:37.957000 audit[2625]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2595 pid=2625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.957000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765323639383564366439336338323037653437313033393136333165 Jan 23 17:26:37.958000 audit: BPF prog-id=97 op=LOAD Jan 23 
17:26:37.958000 audit[2625]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=2595 pid=2625 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:37.958000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765323639383564366439336338323037653437313033393136333165 Jan 23 17:26:37.986821 containerd[1675]: time="2026-01-23T17:26:37.986703984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-1-0-a-d0877fd079,Uid:d8841c07b85a974cc6ea38d93a11ff09,Namespace:kube-system,Attempt:0,} returns sandbox id \"5cf44d65875e6a2134c83dcee7272447686dec73bdbbb593ee8361d63cc1f971\"" Jan 23 17:26:37.991217 containerd[1675]: time="2026-01-23T17:26:37.991167846Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-1-0-a-d0877fd079,Uid:a17e0b2742d3fb877d04c3a1ae9059a9,Namespace:kube-system,Attempt:0,} returns sandbox id \"e6ceddbba3d4f7bb934acfa130d9ff80bd6018ec018211d2b939235f485941e1\"" Jan 23 17:26:37.992341 containerd[1675]: time="2026-01-23T17:26:37.992196651Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-1-0-a-d0877fd079,Uid:98539f2e02f5603d0819359de26a1522,Namespace:kube-system,Attempt:0,} returns sandbox id \"7e26985d6d93c8207e4710391631ebf09be9da0ac66b5bb1edc81400ecadbae6\"" Jan 23 17:26:37.993017 containerd[1675]: time="2026-01-23T17:26:37.992988534Z" level=info msg="CreateContainer within sandbox \"5cf44d65875e6a2134c83dcee7272447686dec73bdbbb593ee8361d63cc1f971\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 23 17:26:37.995377 containerd[1675]: time="2026-01-23T17:26:37.995344786Z" 
level=info msg="CreateContainer within sandbox \"e6ceddbba3d4f7bb934acfa130d9ff80bd6018ec018211d2b939235f485941e1\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 23 17:26:37.997367 containerd[1675]: time="2026-01-23T17:26:37.997335996Z" level=info msg="CreateContainer within sandbox \"7e26985d6d93c8207e4710391631ebf09be9da0ac66b5bb1edc81400ecadbae6\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 23 17:26:38.005390 containerd[1675]: time="2026-01-23T17:26:38.004484111Z" level=info msg="Container 14d19741bf4fdaed72a5a9039eb4808e1f08862dfd7fd14da54d3bd83383027a: CDI devices from CRI Config.CDIDevices: []" Jan 23 17:26:38.020571 containerd[1675]: time="2026-01-23T17:26:38.020483989Z" level=info msg="CreateContainer within sandbox \"e6ceddbba3d4f7bb934acfa130d9ff80bd6018ec018211d2b939235f485941e1\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"14d19741bf4fdaed72a5a9039eb4808e1f08862dfd7fd14da54d3bd83383027a\"" Jan 23 17:26:38.021203 containerd[1675]: time="2026-01-23T17:26:38.021177073Z" level=info msg="StartContainer for \"14d19741bf4fdaed72a5a9039eb4808e1f08862dfd7fd14da54d3bd83383027a\"" Jan 23 17:26:38.023271 containerd[1675]: time="2026-01-23T17:26:38.023142922Z" level=info msg="connecting to shim 14d19741bf4fdaed72a5a9039eb4808e1f08862dfd7fd14da54d3bd83383027a" address="unix:///run/containerd/s/7a39d99ed47d0dcf11bb1ef8a8b5694747785b02409b9fcb2059835b61215373" protocol=ttrpc version=3 Jan 23 17:26:38.024587 containerd[1675]: time="2026-01-23T17:26:38.024562569Z" level=info msg="Container 5c2e7756de7eab65f7b700c80d989c570101c0ff9cda4fb1c227ce4ccffa573f: CDI devices from CRI Config.CDIDevices: []" Jan 23 17:26:38.025041 containerd[1675]: time="2026-01-23T17:26:38.025012172Z" level=info msg="Container 32aed78b76544c076f5b266dacdad0c5e108e6f3baf421a20aa3fd075bda8705: CDI devices from CRI Config.CDIDevices: []" Jan 23 17:26:38.034905 containerd[1675]: time="2026-01-23T17:26:38.034834820Z" 
level=info msg="CreateContainer within sandbox \"7e26985d6d93c8207e4710391631ebf09be9da0ac66b5bb1edc81400ecadbae6\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"32aed78b76544c076f5b266dacdad0c5e108e6f3baf421a20aa3fd075bda8705\"" Jan 23 17:26:38.035374 containerd[1675]: time="2026-01-23T17:26:38.035346342Z" level=info msg="StartContainer for \"32aed78b76544c076f5b266dacdad0c5e108e6f3baf421a20aa3fd075bda8705\"" Jan 23 17:26:38.036046 containerd[1675]: time="2026-01-23T17:26:38.036006145Z" level=info msg="CreateContainer within sandbox \"5cf44d65875e6a2134c83dcee7272447686dec73bdbbb593ee8361d63cc1f971\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"5c2e7756de7eab65f7b700c80d989c570101c0ff9cda4fb1c227ce4ccffa573f\"" Jan 23 17:26:38.036499 containerd[1675]: time="2026-01-23T17:26:38.036385947Z" level=info msg="StartContainer for \"5c2e7756de7eab65f7b700c80d989c570101c0ff9cda4fb1c227ce4ccffa573f\"" Jan 23 17:26:38.036699 containerd[1675]: time="2026-01-23T17:26:38.036652909Z" level=info msg="connecting to shim 32aed78b76544c076f5b266dacdad0c5e108e6f3baf421a20aa3fd075bda8705" address="unix:///run/containerd/s/75146ef1517a783455e5098d3a21642e585a38568855b05f2ee462b15a145d02" protocol=ttrpc version=3 Jan 23 17:26:38.037775 containerd[1675]: time="2026-01-23T17:26:38.037723954Z" level=info msg="connecting to shim 5c2e7756de7eab65f7b700c80d989c570101c0ff9cda4fb1c227ce4ccffa573f" address="unix:///run/containerd/s/f8d16c9df8d4a7eae414ae67a60d4ab77c3fe6a5ce80e3cef06d3b7d0f8efb33" protocol=ttrpc version=3 Jan 23 17:26:38.043523 systemd[1]: Started cri-containerd-14d19741bf4fdaed72a5a9039eb4808e1f08862dfd7fd14da54d3bd83383027a.scope - libcontainer container 14d19741bf4fdaed72a5a9039eb4808e1f08862dfd7fd14da54d3bd83383027a. 
Jan 23 17:26:38.053281 kubelet[2505]: E0123 17:26:38.052836 2505 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.1.37:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.1.37:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 23 17:26:38.060762 systemd[1]: Started cri-containerd-32aed78b76544c076f5b266dacdad0c5e108e6f3baf421a20aa3fd075bda8705.scope - libcontainer container 32aed78b76544c076f5b266dacdad0c5e108e6f3baf421a20aa3fd075bda8705. Jan 23 17:26:38.064126 systemd[1]: Started cri-containerd-5c2e7756de7eab65f7b700c80d989c570101c0ff9cda4fb1c227ce4ccffa573f.scope - libcontainer container 5c2e7756de7eab65f7b700c80d989c570101c0ff9cda4fb1c227ce4ccffa573f. Jan 23 17:26:38.066000 audit: BPF prog-id=98 op=LOAD Jan 23 17:26:38.068000 audit: BPF prog-id=99 op=LOAD Jan 23 17:26:38.068000 audit[2703]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2597 pid=2703 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134643139373431626634666461656437326135613930333965623438 Jan 23 17:26:38.068000 audit: BPF prog-id=99 op=UNLOAD Jan 23 17:26:38.068000 audit[2703]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2597 pid=2703 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.068000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134643139373431626634666461656437326135613930333965623438 Jan 23 17:26:38.068000 audit: BPF prog-id=100 op=LOAD Jan 23 17:26:38.068000 audit[2703]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2597 pid=2703 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134643139373431626634666461656437326135613930333965623438 Jan 23 17:26:38.068000 audit: BPF prog-id=101 op=LOAD Jan 23 17:26:38.068000 audit[2703]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2597 pid=2703 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134643139373431626634666461656437326135613930333965623438 Jan 23 17:26:38.068000 audit: BPF prog-id=101 op=UNLOAD Jan 23 17:26:38.068000 audit[2703]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2597 pid=2703 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 17:26:38.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134643139373431626634666461656437326135613930333965623438 Jan 23 17:26:38.068000 audit: BPF prog-id=100 op=UNLOAD Jan 23 17:26:38.068000 audit[2703]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2597 pid=2703 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134643139373431626634666461656437326135613930333965623438 Jan 23 17:26:38.068000 audit: BPF prog-id=102 op=LOAD Jan 23 17:26:38.068000 audit[2703]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2597 pid=2703 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.068000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3134643139373431626634666461656437326135613930333965623438 Jan 23 17:26:38.076000 audit: BPF prog-id=103 op=LOAD Jan 23 17:26:38.076000 audit: BPF prog-id=104 op=LOAD Jan 23 17:26:38.076000 audit[2715]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2595 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332616564373862373635343463303736663562323636646163646164 Jan 23 17:26:38.076000 audit: BPF prog-id=104 op=UNLOAD Jan 23 17:26:38.076000 audit[2715]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2595 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332616564373862373635343463303736663562323636646163646164 Jan 23 17:26:38.076000 audit: BPF prog-id=105 op=LOAD Jan 23 17:26:38.076000 audit[2715]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2595 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332616564373862373635343463303736663562323636646163646164 Jan 23 17:26:38.076000 audit: BPF prog-id=106 op=LOAD Jan 23 17:26:38.076000 audit[2715]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2595 pid=2715 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332616564373862373635343463303736663562323636646163646164 Jan 23 17:26:38.076000 audit: BPF prog-id=106 op=UNLOAD Jan 23 17:26:38.076000 audit[2715]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2595 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332616564373862373635343463303736663562323636646163646164 Jan 23 17:26:38.076000 audit: BPF prog-id=105 op=UNLOAD Jan 23 17:26:38.076000 audit[2715]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2595 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332616564373862373635343463303736663562323636646163646164 Jan 23 17:26:38.076000 audit: BPF prog-id=107 op=LOAD Jan 23 17:26:38.076000 audit[2715]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 
ppid=2595 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332616564373862373635343463303736663562323636646163646164 Jan 23 17:26:38.077000 audit: BPF prog-id=108 op=LOAD Jan 23 17:26:38.078000 audit: BPF prog-id=109 op=LOAD Jan 23 17:26:38.078000 audit[2716]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2571 pid=2716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563326537373536646537656162363566376237303063383064393839 Jan 23 17:26:38.078000 audit: BPF prog-id=109 op=UNLOAD Jan 23 17:26:38.078000 audit[2716]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2571 pid=2716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563326537373536646537656162363566376237303063383064393839 Jan 23 17:26:38.078000 audit: BPF prog-id=110 op=LOAD Jan 23 17:26:38.078000 
audit[2716]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2571 pid=2716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563326537373536646537656162363566376237303063383064393839 Jan 23 17:26:38.078000 audit: BPF prog-id=111 op=LOAD Jan 23 17:26:38.078000 audit[2716]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2571 pid=2716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563326537373536646537656162363566376237303063383064393839 Jan 23 17:26:38.078000 audit: BPF prog-id=111 op=UNLOAD Jan 23 17:26:38.078000 audit[2716]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2571 pid=2716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563326537373536646537656162363566376237303063383064393839 Jan 23 17:26:38.078000 audit: BPF 
prog-id=110 op=UNLOAD Jan 23 17:26:38.078000 audit[2716]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2571 pid=2716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563326537373536646537656162363566376237303063383064393839 Jan 23 17:26:38.078000 audit: BPF prog-id=112 op=LOAD Jan 23 17:26:38.078000 audit[2716]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2571 pid=2716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:38.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3563326537373536646537656162363566376237303063383064393839 Jan 23 17:26:38.107478 containerd[1675]: time="2026-01-23T17:26:38.107432376Z" level=info msg="StartContainer for \"14d19741bf4fdaed72a5a9039eb4808e1f08862dfd7fd14da54d3bd83383027a\" returns successfully" Jan 23 17:26:38.121732 containerd[1675]: time="2026-01-23T17:26:38.121540805Z" level=info msg="StartContainer for \"5c2e7756de7eab65f7b700c80d989c570101c0ff9cda4fb1c227ce4ccffa573f\" returns successfully" Jan 23 17:26:38.122539 containerd[1675]: time="2026-01-23T17:26:38.122512010Z" level=info msg="StartContainer for \"32aed78b76544c076f5b266dacdad0c5e108e6f3baf421a20aa3fd075bda8705\" returns successfully" Jan 23 17:26:38.158222 kubelet[2505]: E0123 17:26:38.158100 
2505 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.1.37:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.1.37:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 23 17:26:38.416599 kubelet[2505]: I0123 17:26:38.416565 2505 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:38.870536 kubelet[2505]: E0123 17:26:38.870428 2505 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-a-d0877fd079\" not found" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:38.871673 kubelet[2505]: E0123 17:26:38.871228 2505 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-a-d0877fd079\" not found" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:38.874046 kubelet[2505]: E0123 17:26:38.874023 2505 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-1-0-a-d0877fd079\" not found" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:39.618453 kubelet[2505]: E0123 17:26:39.618396 2505 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4547-1-0-a-d0877fd079\" not found" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:39.708154 kubelet[2505]: I0123 17:26:39.708027 2505 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:39.708154 kubelet[2505]: E0123 17:26:39.708158 2505 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4547-1-0-a-d0877fd079\": node \"ci-4547-1-0-a-d0877fd079\" not found" Jan 23 17:26:39.733725 kubelet[2505]: I0123 17:26:39.733690 2505 kubelet.go:3219] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:39.742090 kubelet[2505]: E0123 17:26:39.742060 2505 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-1-0-a-d0877fd079\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:39.742090 kubelet[2505]: I0123 17:26:39.742090 2505 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:39.744266 kubelet[2505]: E0123 17:26:39.744236 2505 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-1-0-a-d0877fd079\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:39.744266 kubelet[2505]: I0123 17:26:39.744266 2505 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:39.746036 kubelet[2505]: E0123 17:26:39.746012 2505 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-1-0-a-d0877fd079\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:39.822883 kubelet[2505]: I0123 17:26:39.822848 2505 apiserver.go:52] "Watching apiserver" Jan 23 17:26:39.834324 kubelet[2505]: I0123 17:26:39.834246 2505 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 23 17:26:39.874920 kubelet[2505]: I0123 17:26:39.874577 2505 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:39.874920 kubelet[2505]: I0123 17:26:39.874898 2505 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:39.877146 kubelet[2505]: 
E0123 17:26:39.877035 2505 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-1-0-a-d0877fd079\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:39.877748 kubelet[2505]: E0123 17:26:39.877721 2505 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-1-0-a-d0877fd079\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:40.645399 kubelet[2505]: I0123 17:26:40.645283 2505 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:40.876360 kubelet[2505]: I0123 17:26:40.876205 2505 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:41.960994 systemd[1]: Reload requested from client PID 2808 ('systemctl') (unit session-8.scope)... Jan 23 17:26:41.961021 systemd[1]: Reloading... Jan 23 17:26:42.046338 zram_generator::config[2855]: No configuration found. Jan 23 17:26:42.056078 kubelet[2505]: I0123 17:26:42.056040 2505 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.230710 systemd[1]: Reloading finished in 269 ms. Jan 23 17:26:42.266232 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 17:26:42.282415 systemd[1]: kubelet.service: Deactivated successfully. Jan 23 17:26:42.282710 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 17:26:42.281000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:26:42.282784 systemd[1]: kubelet.service: Consumed 1.737s CPU time, 124.1M memory peak. Jan 23 17:26:42.283647 kernel: kauditd_printk_skb: 202 callbacks suppressed Jan 23 17:26:42.283718 kernel: audit: type=1131 audit(1769189202.281:394): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:42.286559 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 23 17:26:42.286000 audit: BPF prog-id=113 op=LOAD Jan 23 17:26:42.286000 audit: BPF prog-id=80 op=UNLOAD Jan 23 17:26:42.287000 audit: BPF prog-id=114 op=LOAD Jan 23 17:26:42.287000 audit: BPF prog-id=115 op=LOAD Jan 23 17:26:42.291291 kernel: audit: type=1334 audit(1769189202.286:395): prog-id=113 op=LOAD Jan 23 17:26:42.291358 kernel: audit: type=1334 audit(1769189202.286:396): prog-id=80 op=UNLOAD Jan 23 17:26:42.291378 kernel: audit: type=1334 audit(1769189202.287:397): prog-id=114 op=LOAD Jan 23 17:26:42.291395 kernel: audit: type=1334 audit(1769189202.287:398): prog-id=115 op=LOAD Jan 23 17:26:42.291412 kernel: audit: type=1334 audit(1769189202.287:399): prog-id=81 op=UNLOAD Jan 23 17:26:42.291429 kernel: audit: type=1334 audit(1769189202.287:400): prog-id=82 op=UNLOAD Jan 23 17:26:42.287000 audit: BPF prog-id=81 op=UNLOAD Jan 23 17:26:42.287000 audit: BPF prog-id=82 op=UNLOAD Jan 23 17:26:42.288000 audit: BPF prog-id=116 op=LOAD Jan 23 17:26:42.292713 kernel: audit: type=1334 audit(1769189202.288:401): prog-id=116 op=LOAD Jan 23 17:26:42.292750 kernel: audit: type=1334 audit(1769189202.288:402): prog-id=77 op=UNLOAD Jan 23 17:26:42.288000 audit: BPF prog-id=77 op=UNLOAD Jan 23 17:26:42.289000 audit: BPF prog-id=117 op=LOAD Jan 23 17:26:42.294327 kernel: audit: type=1334 audit(1769189202.289:403): prog-id=117 op=LOAD Jan 23 17:26:42.300000 audit: BPF prog-id=118 op=LOAD Jan 23 17:26:42.300000 audit: BPF prog-id=78 
op=UNLOAD Jan 23 17:26:42.300000 audit: BPF prog-id=79 op=UNLOAD Jan 23 17:26:42.300000 audit: BPF prog-id=119 op=LOAD Jan 23 17:26:42.300000 audit: BPF prog-id=64 op=UNLOAD Jan 23 17:26:42.300000 audit: BPF prog-id=120 op=LOAD Jan 23 17:26:42.300000 audit: BPF prog-id=121 op=LOAD Jan 23 17:26:42.300000 audit: BPF prog-id=65 op=UNLOAD Jan 23 17:26:42.300000 audit: BPF prog-id=66 op=UNLOAD Jan 23 17:26:42.301000 audit: BPF prog-id=122 op=LOAD Jan 23 17:26:42.301000 audit: BPF prog-id=70 op=UNLOAD Jan 23 17:26:42.302000 audit: BPF prog-id=123 op=LOAD Jan 23 17:26:42.302000 audit: BPF prog-id=124 op=LOAD Jan 23 17:26:42.302000 audit: BPF prog-id=68 op=UNLOAD Jan 23 17:26:42.302000 audit: BPF prog-id=69 op=UNLOAD Jan 23 17:26:42.303000 audit: BPF prog-id=125 op=LOAD Jan 23 17:26:42.303000 audit: BPF prog-id=74 op=UNLOAD Jan 23 17:26:42.303000 audit: BPF prog-id=126 op=LOAD Jan 23 17:26:42.303000 audit: BPF prog-id=127 op=LOAD Jan 23 17:26:42.303000 audit: BPF prog-id=75 op=UNLOAD Jan 23 17:26:42.303000 audit: BPF prog-id=76 op=UNLOAD Jan 23 17:26:42.304000 audit: BPF prog-id=128 op=LOAD Jan 23 17:26:42.304000 audit: BPF prog-id=67 op=UNLOAD Jan 23 17:26:42.305000 audit: BPF prog-id=129 op=LOAD Jan 23 17:26:42.305000 audit: BPF prog-id=71 op=UNLOAD Jan 23 17:26:42.305000 audit: BPF prog-id=130 op=LOAD Jan 23 17:26:42.305000 audit: BPF prog-id=131 op=LOAD Jan 23 17:26:42.305000 audit: BPF prog-id=72 op=UNLOAD Jan 23 17:26:42.305000 audit: BPF prog-id=73 op=UNLOAD Jan 23 17:26:42.305000 audit: BPF prog-id=132 op=LOAD Jan 23 17:26:42.305000 audit: BPF prog-id=63 op=UNLOAD Jan 23 17:26:42.446364 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 23 17:26:42.445000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:26:42.461632 (kubelet)[2900]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 23 17:26:42.502214 kubelet[2900]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 23 17:26:42.502214 kubelet[2900]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 23 17:26:42.502214 kubelet[2900]: I0123 17:26:42.502174 2900 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 23 17:26:42.510355 kubelet[2900]: I0123 17:26:42.510297 2900 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Jan 23 17:26:42.511336 kubelet[2900]: I0123 17:26:42.510497 2900 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 23 17:26:42.511336 kubelet[2900]: I0123 17:26:42.510536 2900 watchdog_linux.go:95] "Systemd watchdog is not enabled" Jan 23 17:26:42.511336 kubelet[2900]: I0123 17:26:42.510542 2900 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 23 17:26:42.511336 kubelet[2900]: I0123 17:26:42.510752 2900 server.go:956] "Client rotation is on, will bootstrap in background" Jan 23 17:26:42.511986 kubelet[2900]: I0123 17:26:42.511956 2900 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 23 17:26:42.514070 kubelet[2900]: I0123 17:26:42.514049 2900 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 23 17:26:42.517247 kubelet[2900]: I0123 17:26:42.517226 2900 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 23 17:26:42.519638 kubelet[2900]: I0123 17:26:42.519616 2900 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /" Jan 23 17:26:42.520053 kubelet[2900]: I0123 17:26:42.520028 2900 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 23 17:26:42.520192 kubelet[2900]: I0123 17:26:42.520053 2900 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ci-4547-1-0-a-d0877fd079","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 23 17:26:42.520264 kubelet[2900]: I0123 17:26:42.520193 2900 topology_manager.go:138] "Creating topology manager with none policy" Jan 23 17:26:42.520264 kubelet[2900]: I0123 17:26:42.520203 2900 container_manager_linux.go:306] "Creating device plugin manager" Jan 23 17:26:42.520264 kubelet[2900]: I0123 17:26:42.520225 2900 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Jan 23 17:26:42.522934 kubelet[2900]: I0123 17:26:42.522905 2900 
state_mem.go:36] "Initialized new in-memory state store" Jan 23 17:26:42.523095 kubelet[2900]: I0123 17:26:42.523082 2900 kubelet.go:475] "Attempting to sync node with API server" Jan 23 17:26:42.523126 kubelet[2900]: I0123 17:26:42.523100 2900 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 23 17:26:42.523126 kubelet[2900]: I0123 17:26:42.523125 2900 kubelet.go:387] "Adding apiserver pod source" Jan 23 17:26:42.523851 kubelet[2900]: I0123 17:26:42.523140 2900 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 23 17:26:42.527313 kubelet[2900]: I0123 17:26:42.524516 2900 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 23 17:26:42.527313 kubelet[2900]: I0123 17:26:42.525143 2900 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 23 17:26:42.527313 kubelet[2900]: I0123 17:26:42.525173 2900 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Jan 23 17:26:42.527313 kubelet[2900]: I0123 17:26:42.527151 2900 server.go:1262] "Started kubelet" Jan 23 17:26:42.527492 kubelet[2900]: I0123 17:26:42.527452 2900 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 23 17:26:42.527524 kubelet[2900]: I0123 17:26:42.527465 2900 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 23 17:26:42.527560 kubelet[2900]: I0123 17:26:42.527539 2900 server_v1.go:49] "podresources" method="list" useActivePods=true Jan 23 17:26:42.527764 kubelet[2900]: I0123 17:26:42.527736 2900 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 23 17:26:42.528599 kubelet[2900]: I0123 17:26:42.528581 2900 fs_resource_analyzer.go:67] 
"Starting FS ResourceAnalyzer" Jan 23 17:26:42.528915 kubelet[2900]: I0123 17:26:42.528883 2900 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 23 17:26:42.529319 kubelet[2900]: I0123 17:26:42.529289 2900 server.go:310] "Adding debug handlers to kubelet server" Jan 23 17:26:42.534211 kubelet[2900]: E0123 17:26:42.532722 2900 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-1-0-a-d0877fd079\" not found" Jan 23 17:26:42.534211 kubelet[2900]: I0123 17:26:42.532768 2900 volume_manager.go:313] "Starting Kubelet Volume Manager" Jan 23 17:26:42.534211 kubelet[2900]: I0123 17:26:42.532953 2900 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 23 17:26:42.534211 kubelet[2900]: I0123 17:26:42.533063 2900 reconciler.go:29] "Reconciler: start to sync state" Jan 23 17:26:42.541515 kubelet[2900]: E0123 17:26:42.541478 2900 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 23 17:26:42.543413 kubelet[2900]: I0123 17:26:42.543013 2900 factory.go:223] Registration of the systemd container factory successfully Jan 23 17:26:42.543413 kubelet[2900]: I0123 17:26:42.543206 2900 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 23 17:26:42.547394 kubelet[2900]: I0123 17:26:42.547071 2900 factory.go:223] Registration of the containerd container factory successfully Jan 23 17:26:42.556327 kubelet[2900]: I0123 17:26:42.556020 2900 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Jan 23 17:26:42.559695 kubelet[2900]: I0123 17:26:42.559566 2900 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Jan 23 17:26:42.559695 kubelet[2900]: I0123 17:26:42.559598 2900 status_manager.go:244] "Starting to sync pod status with apiserver" Jan 23 17:26:42.559695 kubelet[2900]: I0123 17:26:42.559618 2900 kubelet.go:2427] "Starting kubelet main sync loop" Jan 23 17:26:42.560268 kubelet[2900]: E0123 17:26:42.560080 2900 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 23 17:26:42.587926 kubelet[2900]: I0123 17:26:42.587899 2900 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 23 17:26:42.587926 kubelet[2900]: I0123 17:26:42.587928 2900 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 23 17:26:42.588077 kubelet[2900]: I0123 17:26:42.587949 2900 state_mem.go:36] "Initialized new in-memory state store" Jan 23 17:26:42.588103 kubelet[2900]: I0123 17:26:42.588075 2900 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 23 17:26:42.588103 kubelet[2900]: I0123 17:26:42.588085 2900 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 23 17:26:42.588103 kubelet[2900]: I0123 17:26:42.588102 2900 policy_none.go:49] "None policy: Start" Jan 23 17:26:42.588163 kubelet[2900]: I0123 17:26:42.588109 2900 memory_manager.go:187] "Starting memorymanager" policy="None" Jan 23 17:26:42.588163 kubelet[2900]: I0123 17:26:42.588117 2900 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Jan 23 17:26:42.588240 kubelet[2900]: I0123 17:26:42.588205 2900 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Jan 23 17:26:42.588240 kubelet[2900]: I0123 17:26:42.588219 2900 policy_none.go:47] "Start" Jan 23 17:26:42.593498 kubelet[2900]: E0123 17:26:42.593352 2900 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 23 17:26:42.593587 kubelet[2900]: I0123 17:26:42.593529 
2900 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 23 17:26:42.593587 kubelet[2900]: I0123 17:26:42.593541 2900 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 23 17:26:42.594363 kubelet[2900]: I0123 17:26:42.594211 2900 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 23 17:26:42.595732 kubelet[2900]: E0123 17:26:42.594670 2900 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 23 17:26:42.662126 kubelet[2900]: I0123 17:26:42.662086 2900 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.663285 kubelet[2900]: I0123 17:26:42.662334 2900 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.663285 kubelet[2900]: I0123 17:26:42.662090 2900 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.670500 kubelet[2900]: E0123 17:26:42.670466 2900 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-1-0-a-d0877fd079\" already exists" pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.671021 kubelet[2900]: E0123 17:26:42.670998 2900 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-1-0-a-d0877fd079\" already exists" pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.671192 kubelet[2900]: E0123 17:26:42.671157 2900 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-1-0-a-d0877fd079\" already exists" pod="kube-system/kube-scheduler-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.696866 kubelet[2900]: I0123 17:26:42.696841 2900 kubelet_node_status.go:75] "Attempting to register node" 
node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.704914 kubelet[2900]: I0123 17:26:42.704849 2900 kubelet_node_status.go:124] "Node was previously registered" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.705407 kubelet[2900]: I0123 17:26:42.705055 2900 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.834489 kubelet[2900]: I0123 17:26:42.834371 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a17e0b2742d3fb877d04c3a1ae9059a9-k8s-certs\") pod \"kube-apiserver-ci-4547-1-0-a-d0877fd079\" (UID: \"a17e0b2742d3fb877d04c3a1ae9059a9\") " pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.834489 kubelet[2900]: I0123 17:26:42.834434 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a17e0b2742d3fb877d04c3a1ae9059a9-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-1-0-a-d0877fd079\" (UID: \"a17e0b2742d3fb877d04c3a1ae9059a9\") " pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.834489 kubelet[2900]: I0123 17:26:42.834457 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d8841c07b85a974cc6ea38d93a11ff09-k8s-certs\") pod \"kube-controller-manager-ci-4547-1-0-a-d0877fd079\" (UID: \"d8841c07b85a974cc6ea38d93a11ff09\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.834489 kubelet[2900]: I0123 17:26:42.834473 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d8841c07b85a974cc6ea38d93a11ff09-kubeconfig\") pod \"kube-controller-manager-ci-4547-1-0-a-d0877fd079\" (UID: \"d8841c07b85a974cc6ea38d93a11ff09\") " 
pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.834489 kubelet[2900]: I0123 17:26:42.834489 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d8841c07b85a974cc6ea38d93a11ff09-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-1-0-a-d0877fd079\" (UID: \"d8841c07b85a974cc6ea38d93a11ff09\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.834690 kubelet[2900]: I0123 17:26:42.834507 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d8841c07b85a974cc6ea38d93a11ff09-ca-certs\") pod \"kube-controller-manager-ci-4547-1-0-a-d0877fd079\" (UID: \"d8841c07b85a974cc6ea38d93a11ff09\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.834690 kubelet[2900]: I0123 17:26:42.834543 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/d8841c07b85a974cc6ea38d93a11ff09-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-1-0-a-d0877fd079\" (UID: \"d8841c07b85a974cc6ea38d93a11ff09\") " pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.834690 kubelet[2900]: I0123 17:26:42.834567 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/98539f2e02f5603d0819359de26a1522-kubeconfig\") pod \"kube-scheduler-ci-4547-1-0-a-d0877fd079\" (UID: \"98539f2e02f5603d0819359de26a1522\") " pod="kube-system/kube-scheduler-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:42.834690 kubelet[2900]: I0123 17:26:42.834605 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/a17e0b2742d3fb877d04c3a1ae9059a9-ca-certs\") pod \"kube-apiserver-ci-4547-1-0-a-d0877fd079\" (UID: \"a17e0b2742d3fb877d04c3a1ae9059a9\") " pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:43.524655 kubelet[2900]: I0123 17:26:43.524603 2900 apiserver.go:52] "Watching apiserver" Jan 23 17:26:43.533927 kubelet[2900]: I0123 17:26:43.533884 2900 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 23 17:26:43.580104 kubelet[2900]: I0123 17:26:43.579730 2900 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:43.580104 kubelet[2900]: I0123 17:26:43.580060 2900 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:43.588157 kubelet[2900]: E0123 17:26:43.587520 2900 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-1-0-a-d0877fd079\" already exists" pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:43.588450 kubelet[2900]: E0123 17:26:43.588241 2900 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-1-0-a-d0877fd079\" already exists" pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" Jan 23 17:26:43.599511 kubelet[2900]: I0123 17:26:43.599448 2900 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4547-1-0-a-d0877fd079" podStartSLOduration=3.599431157 podStartE2EDuration="3.599431157s" podCreationTimestamp="2026-01-23 17:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 17:26:43.599248316 +0000 UTC m=+1.134791968" watchObservedRunningTime="2026-01-23 17:26:43.599431157 +0000 UTC m=+1.134974849" Jan 23 17:26:43.618718 kubelet[2900]: I0123 17:26:43.618594 2900 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4547-1-0-a-d0877fd079" podStartSLOduration=3.618576891 podStartE2EDuration="3.618576891s" podCreationTimestamp="2026-01-23 17:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 17:26:43.609180005 +0000 UTC m=+1.144723697" watchObservedRunningTime="2026-01-23 17:26:43.618576891 +0000 UTC m=+1.154120583" Jan 23 17:26:43.619054 kubelet[2900]: I0123 17:26:43.619017 2900 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4547-1-0-a-d0877fd079" podStartSLOduration=1.618930133 podStartE2EDuration="1.618930133s" podCreationTimestamp="2026-01-23 17:26:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 17:26:43.617804887 +0000 UTC m=+1.153348579" watchObservedRunningTime="2026-01-23 17:26:43.618930133 +0000 UTC m=+1.154473865" Jan 23 17:26:47.749809 kubelet[2900]: I0123 17:26:47.749763 2900 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 23 17:26:47.750767 containerd[1675]: time="2026-01-23T17:26:47.750629481Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 23 17:26:47.751001 kubelet[2900]: I0123 17:26:47.750836 2900 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 23 17:26:48.446735 systemd[1]: Created slice kubepods-besteffort-pod3d83f18a_82fa_4de5_b90c_8824e4143587.slice - libcontainer container kubepods-besteffort-pod3d83f18a_82fa_4de5_b90c_8824e4143587.slice. 
Jan 23 17:26:48.471394 kubelet[2900]: I0123 17:26:48.471360 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/3d83f18a-82fa-4de5-b90c-8824e4143587-kube-proxy\") pod \"kube-proxy-shjp4\" (UID: \"3d83f18a-82fa-4de5-b90c-8824e4143587\") " pod="kube-system/kube-proxy-shjp4" Jan 23 17:26:48.471394 kubelet[2900]: I0123 17:26:48.471429 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmdj2\" (UniqueName: \"kubernetes.io/projected/3d83f18a-82fa-4de5-b90c-8824e4143587-kube-api-access-xmdj2\") pod \"kube-proxy-shjp4\" (UID: \"3d83f18a-82fa-4de5-b90c-8824e4143587\") " pod="kube-system/kube-proxy-shjp4" Jan 23 17:26:48.471394 kubelet[2900]: I0123 17:26:48.471454 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3d83f18a-82fa-4de5-b90c-8824e4143587-xtables-lock\") pod \"kube-proxy-shjp4\" (UID: \"3d83f18a-82fa-4de5-b90c-8824e4143587\") " pod="kube-system/kube-proxy-shjp4" Jan 23 17:26:48.471394 kubelet[2900]: I0123 17:26:48.471468 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d83f18a-82fa-4de5-b90c-8824e4143587-lib-modules\") pod \"kube-proxy-shjp4\" (UID: \"3d83f18a-82fa-4de5-b90c-8824e4143587\") " pod="kube-system/kube-proxy-shjp4" Jan 23 17:26:48.759937 containerd[1675]: time="2026-01-23T17:26:48.759847152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-shjp4,Uid:3d83f18a-82fa-4de5-b90c-8824e4143587,Namespace:kube-system,Attempt:0,}" Jan 23 17:26:48.779549 containerd[1675]: time="2026-01-23T17:26:48.779425608Z" level=info msg="connecting to shim f34ca6f1cd055b65691b826c2b01d4e856f7bbcb0118dabe4a12e2a613486124" 
address="unix:///run/containerd/s/2beca1d2bacf8838ef8c16e52e270de4a157dcc703bf636f537a541ac49d7cac" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:26:48.806590 systemd[1]: Started cri-containerd-f34ca6f1cd055b65691b826c2b01d4e856f7bbcb0118dabe4a12e2a613486124.scope - libcontainer container f34ca6f1cd055b65691b826c2b01d4e856f7bbcb0118dabe4a12e2a613486124. Jan 23 17:26:48.816000 audit: BPF prog-id=133 op=LOAD Jan 23 17:26:48.818987 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 23 17:26:48.819065 kernel: audit: type=1334 audit(1769189208.816:436): prog-id=133 op=LOAD Jan 23 17:26:48.819132 kernel: audit: type=1334 audit(1769189208.817:437): prog-id=134 op=LOAD Jan 23 17:26:48.817000 audit: BPF prog-id=134 op=LOAD Jan 23 17:26:48.819748 kernel: audit: type=1300 audit(1769189208.817:437): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2961 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.817000 audit[2973]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2961 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.822730 kernel: audit: type=1327 audit(1769189208.817:437): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633346361366631636430353562363536393162383236633262303164 Jan 23 17:26:48.817000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633346361366631636430353562363536393162383236633262303164 Jan 23 17:26:48.817000 audit: BPF prog-id=134 op=UNLOAD Jan 23 17:26:48.826457 kernel: audit: type=1334 audit(1769189208.817:438): prog-id=134 op=UNLOAD Jan 23 17:26:48.826519 kernel: audit: type=1300 audit(1769189208.817:438): arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2961 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.817000 audit[2973]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2961 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633346361366631636430353562363536393162383236633262303164 Jan 23 17:26:48.832098 kernel: audit: type=1327 audit(1769189208.817:438): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633346361366631636430353562363536393162383236633262303164 Jan 23 17:26:48.832445 kernel: audit: type=1334 audit(1769189208.817:439): prog-id=135 op=LOAD Jan 23 17:26:48.817000 audit: BPF prog-id=135 op=LOAD Jan 23 17:26:48.817000 audit[2973]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 
items=0 ppid=2961 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.836009 kernel: audit: type=1300 audit(1769189208.817:439): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2961 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.836090 kernel: audit: type=1327 audit(1769189208.817:439): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633346361366631636430353562363536393162383236633262303164 Jan 23 17:26:48.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633346361366631636430353562363536393162383236633262303164 Jan 23 17:26:48.821000 audit: BPF prog-id=136 op=LOAD Jan 23 17:26:48.821000 audit[2973]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2961 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.821000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633346361366631636430353562363536393162383236633262303164 Jan 23 17:26:48.825000 audit: BPF prog-id=136 op=UNLOAD Jan 23 17:26:48.825000 audit[2973]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2961 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.825000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633346361366631636430353562363536393162383236633262303164 Jan 23 17:26:48.825000 audit: BPF prog-id=135 op=UNLOAD Jan 23 17:26:48.825000 audit[2973]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2961 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.825000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633346361366631636430353562363536393162383236633262303164 Jan 23 17:26:48.825000 audit: BPF prog-id=137 op=LOAD Jan 23 17:26:48.825000 audit[2973]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2961 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.825000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633346361366631636430353562363536393162383236633262303164 Jan 23 17:26:48.855713 containerd[1675]: 
time="2026-01-23T17:26:48.855513061Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-shjp4,Uid:3d83f18a-82fa-4de5-b90c-8824e4143587,Namespace:kube-system,Attempt:0,} returns sandbox id \"f34ca6f1cd055b65691b826c2b01d4e856f7bbcb0118dabe4a12e2a613486124\"" Jan 23 17:26:48.861513 containerd[1675]: time="2026-01-23T17:26:48.861436730Z" level=info msg="CreateContainer within sandbox \"f34ca6f1cd055b65691b826c2b01d4e856f7bbcb0118dabe4a12e2a613486124\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 23 17:26:48.872148 containerd[1675]: time="2026-01-23T17:26:48.872082062Z" level=info msg="Container 18fad396afb2f7181f2a0ab16cb58ea9e04a9b3ee936c54115a01b3468793e83: CDI devices from CRI Config.CDIDevices: []" Jan 23 17:26:48.879211 containerd[1675]: time="2026-01-23T17:26:48.879156737Z" level=info msg="CreateContainer within sandbox \"f34ca6f1cd055b65691b826c2b01d4e856f7bbcb0118dabe4a12e2a613486124\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"18fad396afb2f7181f2a0ab16cb58ea9e04a9b3ee936c54115a01b3468793e83\"" Jan 23 17:26:48.879831 containerd[1675]: time="2026-01-23T17:26:48.879803100Z" level=info msg="StartContainer for \"18fad396afb2f7181f2a0ab16cb58ea9e04a9b3ee936c54115a01b3468793e83\"" Jan 23 17:26:48.881384 containerd[1675]: time="2026-01-23T17:26:48.881353708Z" level=info msg="connecting to shim 18fad396afb2f7181f2a0ab16cb58ea9e04a9b3ee936c54115a01b3468793e83" address="unix:///run/containerd/s/2beca1d2bacf8838ef8c16e52e270de4a157dcc703bf636f537a541ac49d7cac" protocol=ttrpc version=3 Jan 23 17:26:48.901542 systemd[1]: Started cri-containerd-18fad396afb2f7181f2a0ab16cb58ea9e04a9b3ee936c54115a01b3468793e83.scope - libcontainer container 18fad396afb2f7181f2a0ab16cb58ea9e04a9b3ee936c54115a01b3468793e83. 
Jan 23 17:26:48.949000 audit: BPF prog-id=138 op=LOAD Jan 23 17:26:48.949000 audit[3000]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2961 pid=3000 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.949000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138666164333936616662326637313831663261306162313663623538 Jan 23 17:26:48.949000 audit: BPF prog-id=139 op=LOAD Jan 23 17:26:48.949000 audit[3000]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2961 pid=3000 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.949000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138666164333936616662326637313831663261306162313663623538 Jan 23 17:26:48.949000 audit: BPF prog-id=139 op=UNLOAD Jan 23 17:26:48.949000 audit[3000]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2961 pid=3000 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.949000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138666164333936616662326637313831663261306162313663623538 Jan 23 17:26:48.949000 audit: BPF prog-id=138 op=UNLOAD Jan 23 17:26:48.949000 audit[3000]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2961 pid=3000 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.949000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138666164333936616662326637313831663261306162313663623538 Jan 23 17:26:48.949000 audit: BPF prog-id=140 op=LOAD Jan 23 17:26:48.949000 audit[3000]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2961 pid=3000 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:48.949000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3138666164333936616662326637313831663261306162313663623538 Jan 23 17:26:48.961082 systemd[1]: Created slice kubepods-besteffort-poddc847d02_b19a_44d5_92d4_7ea7f8b8f467.slice - libcontainer container kubepods-besteffort-poddc847d02_b19a_44d5_92d4_7ea7f8b8f467.slice. 
Jan 23 17:26:48.973388 containerd[1675]: time="2026-01-23T17:26:48.973349079Z" level=info msg="StartContainer for \"18fad396afb2f7181f2a0ab16cb58ea9e04a9b3ee936c54115a01b3468793e83\" returns successfully" Jan 23 17:26:48.974893 kubelet[2900]: I0123 17:26:48.974857 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/dc847d02-b19a-44d5-92d4-7ea7f8b8f467-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-8gtkc\" (UID: \"dc847d02-b19a-44d5-92d4-7ea7f8b8f467\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-8gtkc" Jan 23 17:26:48.975248 kubelet[2900]: I0123 17:26:48.974900 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8w5f\" (UniqueName: \"kubernetes.io/projected/dc847d02-b19a-44d5-92d4-7ea7f8b8f467-kube-api-access-l8w5f\") pod \"tigera-operator-65cdcdfd6d-8gtkc\" (UID: \"dc847d02-b19a-44d5-92d4-7ea7f8b8f467\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-8gtkc" Jan 23 17:26:49.207000 audit[3066]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3066 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.207000 audit[3066]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffff923140 a2=0 a3=1 items=0 ppid=3013 pid=3066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.207000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 23 17:26:49.209000 audit[3067]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3067 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.209000 audit[3067]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc0f73d10 a2=0 a3=1 
items=0 ppid=3013 pid=3067 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.209000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 23 17:26:49.210000 audit[3069]: NETFILTER_CFG table=nat:56 family=2 entries=1 op=nft_register_chain pid=3069 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.210000 audit[3069]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcc733110 a2=0 a3=1 items=0 ppid=3013 pid=3069 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.210000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 23 17:26:49.212000 audit[3072]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_chain pid=3072 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.212000 audit[3072]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc6653750 a2=0 a3=1 items=0 ppid=3013 pid=3072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.212000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 23 17:26:49.213000 audit[3073]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain pid=3073 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.213000 audit[3073]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffda429170 a2=0 a3=1 items=0 ppid=3013 pid=3073 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.213000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 23 17:26:49.215000 audit[3074]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3074 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.215000 audit[3074]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc6fefd20 a2=0 a3=1 items=0 ppid=3013 pid=3074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.215000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 23 17:26:49.271663 containerd[1675]: time="2026-01-23T17:26:49.271602822Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-8gtkc,Uid:dc847d02-b19a-44d5-92d4-7ea7f8b8f467,Namespace:tigera-operator,Attempt:0,}" Jan 23 17:26:49.290119 containerd[1675]: time="2026-01-23T17:26:49.289967552Z" level=info msg="connecting to shim d5c6646efba872b11ae4414bfaf1e6dae26d443d19e42eb200b28671cec6198d" address="unix:///run/containerd/s/b64269228654a1785b714ce7ecd2180aa61dd27cf3b37a9263ea557170b2fcf3" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:26:49.309000 audit[3105]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3105 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.309000 audit[3105]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffe79a8560 a2=0 a3=1 items=0 ppid=3013 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.309000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 23 17:26:49.311000 audit[3107]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3107 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.311000 audit[3107]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=fffffc13f470 a2=0 a3=1 items=0 ppid=3013 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.311000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73002D Jan 23 17:26:49.313591 systemd[1]: Started cri-containerd-d5c6646efba872b11ae4414bfaf1e6dae26d443d19e42eb200b28671cec6198d.scope - libcontainer container d5c6646efba872b11ae4414bfaf1e6dae26d443d19e42eb200b28671cec6198d. 
Jan 23 17:26:49.315000 audit[3111]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3111 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.315000 audit[3111]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffdaa3acb0 a2=0 a3=1 items=0 ppid=3013 pid=3111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.315000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Jan 23 17:26:49.318000 audit[3116]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3116 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.318000 audit[3116]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff1a01500 a2=0 a3=1 items=0 ppid=3013 pid=3116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.318000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 23 17:26:49.320000 audit[3120]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3120 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.320000 audit[3120]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffce678410 a2=0 a3=1 items=0 ppid=3013 pid=3120 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.320000 audit: 
PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 23 17:26:49.321000 audit[3121]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3121 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.321000 audit[3121]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff2be8000 a2=0 a3=1 items=0 ppid=3013 pid=3121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.321000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Jan 23 17:26:49.325000 audit: BPF prog-id=141 op=LOAD Jan 23 17:26:49.325000 audit[3123]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3123 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.325000 audit[3123]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffd6ec5e10 a2=0 a3=1 items=0 ppid=3013 pid=3123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.325000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 17:26:49.326000 audit: BPF prog-id=142 op=LOAD Jan 23 17:26:49.326000 audit[3093]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3083 pid=3093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435633636343665666261383732623131616534343134626661663165 Jan 23 17:26:49.326000 audit: BPF prog-id=142 op=UNLOAD Jan 23 17:26:49.326000 audit[3093]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3083 pid=3093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435633636343665666261383732623131616534343134626661663165 Jan 23 17:26:49.326000 audit: BPF prog-id=143 op=LOAD Jan 23 17:26:49.326000 audit[3093]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3083 pid=3093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435633636343665666261383732623131616534343134626661663165 Jan 23 17:26:49.326000 audit: BPF prog-id=144 op=LOAD Jan 23 17:26:49.326000 audit[3093]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3083 pid=3093 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435633636343665666261383732623131616534343134626661663165 Jan 23 17:26:49.326000 audit: BPF prog-id=144 op=UNLOAD Jan 23 17:26:49.326000 audit[3093]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3083 pid=3093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435633636343665666261383732623131616534343134626661663165 Jan 23 17:26:49.326000 audit: BPF prog-id=143 op=UNLOAD Jan 23 17:26:49.326000 audit[3093]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3083 pid=3093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435633636343665666261383732623131616534343134626661663165 Jan 23 17:26:49.326000 audit: BPF prog-id=145 op=LOAD Jan 23 17:26:49.326000 audit[3093]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 
a2=98 a3=0 items=0 ppid=3083 pid=3093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6435633636343665666261383732623131616534343134626661663165 Jan 23 17:26:49.330000 audit[3126]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3126 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.330000 audit[3126]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffd1bab3a0 a2=0 a3=1 items=0 ppid=3013 pid=3126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.330000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 17:26:49.332000 audit[3127]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3127 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.332000 audit[3127]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffed3fda90 a2=0 a3=1 items=0 ppid=3013 pid=3127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.332000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Jan 23 17:26:49.335000 
audit[3129]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3129 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.335000 audit[3129]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffef929160 a2=0 a3=1 items=0 ppid=3013 pid=3129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.335000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 23 17:26:49.336000 audit[3130]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3130 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.336000 audit[3130]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffffcbc4750 a2=0 a3=1 items=0 ppid=3013 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.336000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 23 17:26:49.340000 audit[3132]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3132 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.340000 audit[3132]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffffe8fa290 a2=0 a3=1 items=0 ppid=3013 pid=3132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.340000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F5859 Jan 23 17:26:49.344000 audit[3140]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3140 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.344000 audit[3140]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd57d2910 a2=0 a3=1 items=0 ppid=3013 pid=3140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.344000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Jan 23 17:26:49.349000 audit[3145]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3145 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.349000 audit[3145]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffdc507ad0 a2=0 a3=1 items=0 ppid=3013 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.349000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Jan 23 17:26:49.351301 containerd[1675]: time="2026-01-23T17:26:49.351260693Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-8gtkc,Uid:dc847d02-b19a-44d5-92d4-7ea7f8b8f467,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"d5c6646efba872b11ae4414bfaf1e6dae26d443d19e42eb200b28671cec6198d\"" Jan 23 17:26:49.351000 audit[3146]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3146 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.351000 audit[3146]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffdd681200 a2=0 a3=1 items=0 ppid=3013 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.351000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Jan 23 17:26:49.354000 audit[3148]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3148 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.354000 audit[3148]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffeed9eb50 a2=0 a3=1 items=0 ppid=3013 pid=3148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.354000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 17:26:49.354886 containerd[1675]: time="2026-01-23T17:26:49.354484229Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 23 17:26:49.358000 audit[3151]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3151 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.358000 audit[3151]: SYSCALL arch=c00000b7 
syscall=211 success=yes exit=528 a0=3 a1=fffff623cd50 a2=0 a3=1 items=0 ppid=3013 pid=3151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.358000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 17:26:49.359000 audit[3152]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3152 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.359000 audit[3152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffebeb5d40 a2=0 a3=1 items=0 ppid=3013 pid=3152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.359000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 23 17:26:49.362000 audit[3154]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3154 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 23 17:26:49.362000 audit[3154]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffcba8dc00 a2=0 a3=1 items=0 ppid=3013 pid=3154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.362000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 23 17:26:49.393000 audit[3160]: NETFILTER_CFG 
table=filter:79 family=2 entries=8 op=nft_register_rule pid=3160 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:26:49.393000 audit[3160]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc3fccf10 a2=0 a3=1 items=0 ppid=3013 pid=3160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.393000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:26:49.405000 audit[3160]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3160 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:26:49.405000 audit[3160]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=ffffc3fccf10 a2=0 a3=1 items=0 ppid=3013 pid=3160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.405000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:26:49.406000 audit[3165]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3165 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.406000 audit[3165]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffe94a0770 a2=0 a3=1 items=0 ppid=3013 pid=3165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.406000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 23 17:26:49.409000 audit[3167]: NETFILTER_CFG table=filter:82 
family=10 entries=2 op=nft_register_chain pid=3167 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.409000 audit[3167]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=fffffeaea730 a2=0 a3=1 items=0 ppid=3013 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.409000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Jan 23 17:26:49.413000 audit[3170]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3170 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.413000 audit[3170]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=fffffa031d30 a2=0 a3=1 items=0 ppid=3013 pid=3170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.413000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C Jan 23 17:26:49.414000 audit[3171]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3171 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.414000 audit[3171]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd989ef30 a2=0 a3=1 items=0 ppid=3013 pid=3171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.414000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 23 17:26:49.417000 audit[3173]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3173 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.417000 audit[3173]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff8892e70 a2=0 a3=1 items=0 ppid=3013 pid=3173 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.417000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 23 17:26:49.418000 audit[3174]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3174 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.418000 audit[3174]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff807faa0 a2=0 a3=1 items=0 ppid=3013 pid=3174 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.418000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Jan 23 17:26:49.421000 audit[3176]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3176 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.421000 audit[3176]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffe9f80130 a2=0 a3=1 items=0 ppid=3013 pid=3176 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.421000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 17:26:49.425000 audit[3179]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3179 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.425000 audit[3179]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffe3cb3fe0 a2=0 a3=1 items=0 ppid=3013 pid=3179 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.425000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 17:26:49.426000 audit[3180]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3180 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.426000 audit[3180]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd07704c0 a2=0 a3=1 items=0 ppid=3013 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.426000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Jan 23 17:26:49.428000 audit[3182]: NETFILTER_CFG 
table=filter:90 family=10 entries=1 op=nft_register_rule pid=3182 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.428000 audit[3182]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffcf61b5a0 a2=0 a3=1 items=0 ppid=3013 pid=3182 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.428000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 23 17:26:49.430000 audit[3183]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3183 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.430000 audit[3183]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffebc938a0 a2=0 a3=1 items=0 ppid=3013 pid=3183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.430000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 23 17:26:49.432000 audit[3185]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3185 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.432000 audit[3185]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffebe65440 a2=0 a3=1 items=0 ppid=3013 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.432000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Jan 23 17:26:49.435000 audit[3188]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3188 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.435000 audit[3188]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffcd95e8d0 a2=0 a3=1 items=0 ppid=3013 pid=3188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.435000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Jan 23 17:26:49.439000 audit[3191]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3191 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.439000 audit[3191]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd55876b0 a2=0 a3=1 items=0 ppid=3013 pid=3191 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.439000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D5052 Jan 23 17:26:49.440000 audit[3192]: NETFILTER_CFG table=nat:95 family=10 entries=1 
op=nft_register_chain pid=3192 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.440000 audit[3192]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff3bd51b0 a2=0 a3=1 items=0 ppid=3013 pid=3192 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.440000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Jan 23 17:26:49.442000 audit[3194]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3194 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.442000 audit[3194]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffd364f040 a2=0 a3=1 items=0 ppid=3013 pid=3194 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.442000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 17:26:49.445000 audit[3197]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3197 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.445000 audit[3197]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd56cf210 a2=0 a3=1 items=0 ppid=3013 pid=3197 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.445000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 23 17:26:49.447000 audit[3198]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3198 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.447000 audit[3198]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe49b78a0 a2=0 a3=1 items=0 ppid=3013 pid=3198 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.447000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 23 17:26:49.449000 audit[3200]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3200 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.449000 audit[3200]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=fffff5f259b0 a2=0 a3=1 items=0 ppid=3013 pid=3200 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.449000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 23 17:26:49.450000 audit[3201]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3201 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.450000 audit[3201]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffee61c9e0 a2=0 a3=1 items=0 ppid=3013 pid=3201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.450000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 23 17:26:49.452000 audit[3203]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3203 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.452000 audit[3203]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffdc3dea00 a2=0 a3=1 items=0 ppid=3013 pid=3203 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.452000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 17:26:49.456000 audit[3206]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3206 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 23 17:26:49.456000 audit[3206]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffd995f2c0 a2=0 a3=1 items=0 ppid=3013 pid=3206 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.456000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 23 17:26:49.459000 audit[3208]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3208 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 23 17:26:49.459000 audit[3208]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 a1=fffff9d63740 a2=0 a3=1 items=0 ppid=3013 pid=3208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.459000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:26:49.460000 audit[3208]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3208 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 23 17:26:49.460000 audit[3208]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=fffff9d63740 a2=0 a3=1 items=0 ppid=3013 pid=3208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:49.460000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:26:49.605272 kubelet[2900]: I0123 17:26:49.605216 2900 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-shjp4" podStartSLOduration=1.605201059 podStartE2EDuration="1.605201059s" podCreationTimestamp="2026-01-23 17:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 17:26:49.605053018 +0000 UTC m=+7.140596750" watchObservedRunningTime="2026-01-23 17:26:49.605201059 +0000 UTC m=+7.140744751" Jan 23 17:26:50.822609 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1494500349.mount: Deactivated successfully. 
Jan 23 17:26:51.471242 containerd[1675]: time="2026-01-23T17:26:51.470875011Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:51.472276 containerd[1675]: time="2026-01-23T17:26:51.472073137Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=20773434" Jan 23 17:26:51.473193 containerd[1675]: time="2026-01-23T17:26:51.473161783Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:51.475689 containerd[1675]: time="2026-01-23T17:26:51.475648195Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:26:51.476506 containerd[1675]: time="2026-01-23T17:26:51.476478839Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.12196065s" Jan 23 17:26:51.476584 containerd[1675]: time="2026-01-23T17:26:51.476569839Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Jan 23 17:26:51.481223 containerd[1675]: time="2026-01-23T17:26:51.481181502Z" level=info msg="CreateContainer within sandbox \"d5c6646efba872b11ae4414bfaf1e6dae26d443d19e42eb200b28671cec6198d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 23 17:26:51.489653 containerd[1675]: time="2026-01-23T17:26:51.488854219Z" level=info msg="Container 
64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06: CDI devices from CRI Config.CDIDevices: []" Jan 23 17:26:51.490461 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1852703040.mount: Deactivated successfully. Jan 23 17:26:51.497936 containerd[1675]: time="2026-01-23T17:26:51.497877184Z" level=info msg="CreateContainer within sandbox \"d5c6646efba872b11ae4414bfaf1e6dae26d443d19e42eb200b28671cec6198d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06\"" Jan 23 17:26:51.498633 containerd[1675]: time="2026-01-23T17:26:51.498413386Z" level=info msg="StartContainer for \"64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06\"" Jan 23 17:26:51.499466 containerd[1675]: time="2026-01-23T17:26:51.499431551Z" level=info msg="connecting to shim 64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06" address="unix:///run/containerd/s/b64269228654a1785b714ce7ecd2180aa61dd27cf3b37a9263ea557170b2fcf3" protocol=ttrpc version=3 Jan 23 17:26:51.525726 systemd[1]: Started cri-containerd-64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06.scope - libcontainer container 64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06. 
Jan 23 17:26:51.535000 audit: BPF prog-id=146 op=LOAD Jan 23 17:26:51.536000 audit: BPF prog-id=147 op=LOAD Jan 23 17:26:51.536000 audit[3217]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3083 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:51.536000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634666462313462303237373161343164326335636139356431336331 Jan 23 17:26:51.536000 audit: BPF prog-id=147 op=UNLOAD Jan 23 17:26:51.536000 audit[3217]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3083 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:51.536000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634666462313462303237373161343164326335636139356431336331 Jan 23 17:26:51.536000 audit: BPF prog-id=148 op=LOAD Jan 23 17:26:51.536000 audit[3217]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3083 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:51.536000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634666462313462303237373161343164326335636139356431336331 Jan 23 17:26:51.536000 audit: BPF prog-id=149 op=LOAD Jan 23 17:26:51.536000 audit[3217]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3083 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:51.536000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634666462313462303237373161343164326335636139356431336331 Jan 23 17:26:51.536000 audit: BPF prog-id=149 op=UNLOAD Jan 23 17:26:51.536000 audit[3217]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3083 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:51.536000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634666462313462303237373161343164326335636139356431336331 Jan 23 17:26:51.536000 audit: BPF prog-id=148 op=UNLOAD Jan 23 17:26:51.536000 audit[3217]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3083 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 17:26:51.536000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634666462313462303237373161343164326335636139356431336331 Jan 23 17:26:51.537000 audit: BPF prog-id=150 op=LOAD Jan 23 17:26:51.537000 audit[3217]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3083 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:51.537000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634666462313462303237373161343164326335636139356431336331 Jan 23 17:26:51.552337 containerd[1675]: time="2026-01-23T17:26:51.552278651Z" level=info msg="StartContainer for \"64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06\" returns successfully" Jan 23 17:26:51.624333 kubelet[2900]: I0123 17:26:51.624171 2900 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-8gtkc" podStartSLOduration=1.5007201860000001 podStartE2EDuration="3.624157243s" podCreationTimestamp="2026-01-23 17:26:48 +0000 UTC" firstStartedPulling="2026-01-23 17:26:49.353908906 +0000 UTC m=+6.889452558" lastFinishedPulling="2026-01-23 17:26:51.477345963 +0000 UTC m=+9.012889615" observedRunningTime="2026-01-23 17:26:51.624063843 +0000 UTC m=+9.159607535" watchObservedRunningTime="2026-01-23 17:26:51.624157243 +0000 UTC m=+9.159700935" Jan 23 17:26:56.848717 sudo[1943]: pam_unix(sudo:session): session closed for user root Jan 23 17:26:56.848000 audit[1943]: USER_END pid=1943 uid=500 auid=500 ses=8 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 17:26:56.850448 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 23 17:26:56.850475 kernel: audit: type=1106 audit(1769189216.848:516): pid=1943 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 17:26:56.848000 audit[1943]: CRED_DISP pid=1943 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 17:26:56.854156 kernel: audit: type=1104 audit(1769189216.848:517): pid=1943 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 23 17:26:56.944348 sshd[1942]: Connection closed by 4.153.228.146 port 57988 Jan 23 17:26:56.943996 sshd-session[1938]: pam_unix(sshd:session): session closed for user core Jan 23 17:26:56.945000 audit[1938]: USER_END pid=1938 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:26:56.952424 systemd[1]: sshd@6-10.0.1.37:22-4.153.228.146:57988.service: Deactivated successfully. 
Jan 23 17:26:56.949000 audit[1938]: CRED_DISP pid=1938 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:26:56.955242 kernel: audit: type=1106 audit(1769189216.945:518): pid=1938 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:26:56.955298 kernel: audit: type=1104 audit(1769189216.949:519): pid=1938 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:26:56.954000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.1.37:22-4.153.228.146:57988 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:56.957956 systemd[1]: session-8.scope: Deactivated successfully. Jan 23 17:26:56.958211 systemd[1]: session-8.scope: Consumed 7.163s CPU time, 221.9M memory peak. Jan 23 17:26:56.959118 kernel: audit: type=1131 audit(1769189216.954:520): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.1.37:22-4.153.228.146:57988 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:26:56.962730 systemd-logind[1644]: Session 8 logged out. Waiting for processes to exit. Jan 23 17:26:56.963605 systemd-logind[1644]: Removed session 8. 
Jan 23 17:26:57.658000 audit[3307]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3307 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:26:57.658000 audit[3307]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffda048b50 a2=0 a3=1 items=0 ppid=3013 pid=3307 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:57.665936 kernel: audit: type=1325 audit(1769189217.658:521): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3307 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:26:57.666054 kernel: audit: type=1300 audit(1769189217.658:521): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffda048b50 a2=0 a3=1 items=0 ppid=3013 pid=3307 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:57.658000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:26:57.670347 kernel: audit: type=1327 audit(1769189217.658:521): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:26:57.668000 audit[3307]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3307 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:26:57.672828 kernel: audit: type=1325 audit(1769189217.668:522): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3307 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:26:57.668000 audit[3307]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffda048b50 a2=0 a3=1 items=0 ppid=3013 pid=3307 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:57.668000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:26:57.683280 kernel: audit: type=1300 audit(1769189217.668:522): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffda048b50 a2=0 a3=1 items=0 ppid=3013 pid=3307 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:58.684000 audit[3309]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3309 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:26:58.684000 audit[3309]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=fffff9bc9a00 a2=0 a3=1 items=0 ppid=3013 pid=3309 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:58.684000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:26:58.691000 audit[3309]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3309 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:26:58.691000 audit[3309]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff9bc9a00 a2=0 a3=1 items=0 ppid=3013 pid=3309 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:26:58.691000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:02.554000 
audit[3311]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3311 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:02.556974 kernel: kauditd_printk_skb: 7 callbacks suppressed Jan 23 17:27:02.557141 kernel: audit: type=1325 audit(1769189222.554:525): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3311 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:02.557163 kernel: audit: type=1300 audit(1769189222.554:525): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffd83a8970 a2=0 a3=1 items=0 ppid=3013 pid=3311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:02.554000 audit[3311]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffd83a8970 a2=0 a3=1 items=0 ppid=3013 pid=3311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:02.554000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:02.562541 kernel: audit: type=1327 audit(1769189222.554:525): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:02.564000 audit[3311]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3311 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:02.564000 audit[3311]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd83a8970 a2=0 a3=1 items=0 ppid=3013 pid=3311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
17:27:02.569767 kernel: audit: type=1325 audit(1769189222.564:526): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3311 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:02.569812 kernel: audit: type=1300 audit(1769189222.564:526): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd83a8970 a2=0 a3=1 items=0 ppid=3013 pid=3311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:02.564000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:02.571413 kernel: audit: type=1327 audit(1769189222.564:526): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:03.580000 audit[3313]: NETFILTER_CFG table=filter:111 family=2 entries=19 op=nft_register_rule pid=3313 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:03.580000 audit[3313]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc71291c0 a2=0 a3=1 items=0 ppid=3013 pid=3313 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:03.585506 kernel: audit: type=1325 audit(1769189223.580:527): table=filter:111 family=2 entries=19 op=nft_register_rule pid=3313 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:03.585550 kernel: audit: type=1300 audit(1769189223.580:527): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc71291c0 a2=0 a3=1 items=0 ppid=3013 pid=3313 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 
17:27:03.585574 kernel: audit: type=1327 audit(1769189223.580:527): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:03.580000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:03.592000 audit[3313]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3313 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:03.592000 audit[3313]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc71291c0 a2=0 a3=1 items=0 ppid=3013 pid=3313 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:03.592000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:03.595347 kernel: audit: type=1325 audit(1769189223.592:528): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3313 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:05.112000 audit[3315]: NETFILTER_CFG table=filter:113 family=2 entries=21 op=nft_register_rule pid=3315 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:05.112000 audit[3315]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=fffff616d3d0 a2=0 a3=1 items=0 ppid=3013 pid=3315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.112000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:05.120000 audit[3315]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3315 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 
17:27:05.120000 audit[3315]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff616d3d0 a2=0 a3=1 items=0 ppid=3013 pid=3315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.120000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:05.154562 systemd[1]: Created slice kubepods-besteffort-pod4562a804_91ca_4b9e_962b_e8ef9fd6e802.slice - libcontainer container kubepods-besteffort-pod4562a804_91ca_4b9e_962b_e8ef9fd6e802.slice. Jan 23 17:27:05.177878 kubelet[2900]: I0123 17:27:05.177839 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x7td\" (UniqueName: \"kubernetes.io/projected/4562a804-91ca-4b9e-962b-e8ef9fd6e802-kube-api-access-9x7td\") pod \"calico-typha-5776bd4fb5-gnfzm\" (UID: \"4562a804-91ca-4b9e-962b-e8ef9fd6e802\") " pod="calico-system/calico-typha-5776bd4fb5-gnfzm" Jan 23 17:27:05.178273 kubelet[2900]: I0123 17:27:05.178223 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4562a804-91ca-4b9e-962b-e8ef9fd6e802-tigera-ca-bundle\") pod \"calico-typha-5776bd4fb5-gnfzm\" (UID: \"4562a804-91ca-4b9e-962b-e8ef9fd6e802\") " pod="calico-system/calico-typha-5776bd4fb5-gnfzm" Jan 23 17:27:05.178273 kubelet[2900]: I0123 17:27:05.178253 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4562a804-91ca-4b9e-962b-e8ef9fd6e802-typha-certs\") pod \"calico-typha-5776bd4fb5-gnfzm\" (UID: \"4562a804-91ca-4b9e-962b-e8ef9fd6e802\") " pod="calico-system/calico-typha-5776bd4fb5-gnfzm" Jan 23 17:27:05.340897 systemd[1]: Created slice 
kubepods-besteffort-pode36568b0_f45f_48dc_aac1_0c1aa7db269a.slice - libcontainer container kubepods-besteffort-pode36568b0_f45f_48dc_aac1_0c1aa7db269a.slice. Jan 23 17:27:05.379723 kubelet[2900]: I0123 17:27:05.379594 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/e36568b0-f45f-48dc-aac1-0c1aa7db269a-flexvol-driver-host\") pod \"calico-node-km4sx\" (UID: \"e36568b0-f45f-48dc-aac1-0c1aa7db269a\") " pod="calico-system/calico-node-km4sx" Jan 23 17:27:05.379723 kubelet[2900]: I0123 17:27:05.379638 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e36568b0-f45f-48dc-aac1-0c1aa7db269a-lib-modules\") pod \"calico-node-km4sx\" (UID: \"e36568b0-f45f-48dc-aac1-0c1aa7db269a\") " pod="calico-system/calico-node-km4sx" Jan 23 17:27:05.379723 kubelet[2900]: I0123 17:27:05.379656 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e36568b0-f45f-48dc-aac1-0c1aa7db269a-var-lib-calico\") pod \"calico-node-km4sx\" (UID: \"e36568b0-f45f-48dc-aac1-0c1aa7db269a\") " pod="calico-system/calico-node-km4sx" Jan 23 17:27:05.379723 kubelet[2900]: I0123 17:27:05.379672 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e36568b0-f45f-48dc-aac1-0c1aa7db269a-cni-net-dir\") pod \"calico-node-km4sx\" (UID: \"e36568b0-f45f-48dc-aac1-0c1aa7db269a\") " pod="calico-system/calico-node-km4sx" Jan 23 17:27:05.379723 kubelet[2900]: I0123 17:27:05.379718 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rht82\" (UniqueName: \"kubernetes.io/projected/e36568b0-f45f-48dc-aac1-0c1aa7db269a-kube-api-access-rht82\") pod 
\"calico-node-km4sx\" (UID: \"e36568b0-f45f-48dc-aac1-0c1aa7db269a\") " pod="calico-system/calico-node-km4sx" Jan 23 17:27:05.379906 kubelet[2900]: I0123 17:27:05.379771 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e36568b0-f45f-48dc-aac1-0c1aa7db269a-var-run-calico\") pod \"calico-node-km4sx\" (UID: \"e36568b0-f45f-48dc-aac1-0c1aa7db269a\") " pod="calico-system/calico-node-km4sx" Jan 23 17:27:05.379906 kubelet[2900]: I0123 17:27:05.379788 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e36568b0-f45f-48dc-aac1-0c1aa7db269a-cni-log-dir\") pod \"calico-node-km4sx\" (UID: \"e36568b0-f45f-48dc-aac1-0c1aa7db269a\") " pod="calico-system/calico-node-km4sx" Jan 23 17:27:05.379906 kubelet[2900]: I0123 17:27:05.379820 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e36568b0-f45f-48dc-aac1-0c1aa7db269a-tigera-ca-bundle\") pod \"calico-node-km4sx\" (UID: \"e36568b0-f45f-48dc-aac1-0c1aa7db269a\") " pod="calico-system/calico-node-km4sx" Jan 23 17:27:05.379906 kubelet[2900]: I0123 17:27:05.379855 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e36568b0-f45f-48dc-aac1-0c1aa7db269a-xtables-lock\") pod \"calico-node-km4sx\" (UID: \"e36568b0-f45f-48dc-aac1-0c1aa7db269a\") " pod="calico-system/calico-node-km4sx" Jan 23 17:27:05.379906 kubelet[2900]: I0123 17:27:05.379881 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e36568b0-f45f-48dc-aac1-0c1aa7db269a-node-certs\") pod \"calico-node-km4sx\" (UID: \"e36568b0-f45f-48dc-aac1-0c1aa7db269a\") " 
pod="calico-system/calico-node-km4sx" Jan 23 17:27:05.379997 kubelet[2900]: I0123 17:27:05.379896 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e36568b0-f45f-48dc-aac1-0c1aa7db269a-policysync\") pod \"calico-node-km4sx\" (UID: \"e36568b0-f45f-48dc-aac1-0c1aa7db269a\") " pod="calico-system/calico-node-km4sx" Jan 23 17:27:05.379997 kubelet[2900]: I0123 17:27:05.379916 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e36568b0-f45f-48dc-aac1-0c1aa7db269a-cni-bin-dir\") pod \"calico-node-km4sx\" (UID: \"e36568b0-f45f-48dc-aac1-0c1aa7db269a\") " pod="calico-system/calico-node-km4sx" Jan 23 17:27:05.461323 containerd[1675]: time="2026-01-23T17:27:05.461224769Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5776bd4fb5-gnfzm,Uid:4562a804-91ca-4b9e-962b-e8ef9fd6e802,Namespace:calico-system,Attempt:0,}" Jan 23 17:27:05.481918 kubelet[2900]: E0123 17:27:05.481770 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.481918 kubelet[2900]: W0123 17:27:05.481822 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.481918 kubelet[2900]: E0123 17:27:05.481853 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.482097 kubelet[2900]: E0123 17:27:05.482036 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.482097 kubelet[2900]: W0123 17:27:05.482050 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.482097 kubelet[2900]: E0123 17:27:05.482059 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.482747 kubelet[2900]: E0123 17:27:05.482263 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.482747 kubelet[2900]: W0123 17:27:05.482274 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.482747 kubelet[2900]: E0123 17:27:05.482282 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.483241 kubelet[2900]: E0123 17:27:05.483221 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.483384 kubelet[2900]: W0123 17:27:05.483367 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.483486 kubelet[2900]: E0123 17:27:05.483470 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.489168 kubelet[2900]: E0123 17:27:05.489129 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.489168 kubelet[2900]: W0123 17:27:05.489153 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.489168 kubelet[2900]: E0123 17:27:05.489173 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.489486 containerd[1675]: time="2026-01-23T17:27:05.489444747Z" level=info msg="connecting to shim 8dadaaadcade86e4b4307225bee26a7e1928c4d934c60ccc01b3fa9204aaddc7" address="unix:///run/containerd/s/452a936a08e778997a0b6f3e2eb077a100d18f032483bbd551b397d33316f5c5" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:27:05.499386 kubelet[2900]: E0123 17:27:05.499222 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.499386 kubelet[2900]: W0123 17:27:05.499243 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.499386 kubelet[2900]: E0123 17:27:05.499266 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.525604 systemd[1]: Started cri-containerd-8dadaaadcade86e4b4307225bee26a7e1928c4d934c60ccc01b3fa9204aaddc7.scope - libcontainer container 8dadaaadcade86e4b4307225bee26a7e1928c4d934c60ccc01b3fa9204aaddc7. 
Jan 23 17:27:05.537797 kubelet[2900]: E0123 17:27:05.537744 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:27:05.549000 audit: BPF prog-id=151 op=LOAD Jan 23 17:27:05.552000 audit: BPF prog-id=152 op=LOAD Jan 23 17:27:05.552000 audit[3345]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3333 pid=3345 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.552000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864616461616164636164653836653462343330373232356265653236 Jan 23 17:27:05.552000 audit: BPF prog-id=152 op=UNLOAD Jan 23 17:27:05.552000 audit[3345]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3333 pid=3345 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.552000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864616461616164636164653836653462343330373232356265653236 Jan 23 17:27:05.553000 audit: BPF prog-id=153 op=LOAD Jan 23 17:27:05.553000 audit[3345]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3333 pid=3345 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.553000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864616461616164636164653836653462343330373232356265653236 Jan 23 17:27:05.553000 audit: BPF prog-id=154 op=LOAD Jan 23 17:27:05.553000 audit[3345]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3333 pid=3345 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.553000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864616461616164636164653836653462343330373232356265653236 Jan 23 17:27:05.553000 audit: BPF prog-id=154 op=UNLOAD Jan 23 17:27:05.553000 audit[3345]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3333 pid=3345 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.553000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864616461616164636164653836653462343330373232356265653236 Jan 23 17:27:05.553000 audit: BPF prog-id=153 op=UNLOAD Jan 23 17:27:05.553000 audit[3345]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 
a2=0 a3=0 items=0 ppid=3333 pid=3345 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.553000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864616461616164636164653836653462343330373232356265653236 Jan 23 17:27:05.553000 audit: BPF prog-id=155 op=LOAD Jan 23 17:27:05.553000 audit[3345]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3333 pid=3345 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.553000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864616461616164636164653836653462343330373232356265653236 Jan 23 17:27:05.569861 kubelet[2900]: E0123 17:27:05.569802 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.569861 kubelet[2900]: W0123 17:27:05.569826 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.569861 kubelet[2900]: E0123 17:27:05.569845 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.570458 kubelet[2900]: E0123 17:27:05.570442 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.570495 kubelet[2900]: W0123 17:27:05.570458 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.570532 kubelet[2900]: E0123 17:27:05.570499 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.571053 kubelet[2900]: E0123 17:27:05.571036 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.571053 kubelet[2900]: W0123 17:27:05.571052 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.571185 kubelet[2900]: E0123 17:27:05.571064 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.572372 kubelet[2900]: E0123 17:27:05.572346 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.572372 kubelet[2900]: W0123 17:27:05.572362 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.572372 kubelet[2900]: E0123 17:27:05.572373 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.572993 kubelet[2900]: E0123 17:27:05.572971 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.572993 kubelet[2900]: W0123 17:27:05.572986 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.573079 kubelet[2900]: E0123 17:27:05.572997 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.573565 kubelet[2900]: E0123 17:27:05.573546 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.573565 kubelet[2900]: W0123 17:27:05.573559 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.573565 kubelet[2900]: E0123 17:27:05.573570 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.575649 kubelet[2900]: E0123 17:27:05.575631 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.576071 kubelet[2900]: W0123 17:27:05.575718 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.576201 kubelet[2900]: E0123 17:27:05.576138 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.576826 kubelet[2900]: E0123 17:27:05.576801 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.577824 kubelet[2900]: W0123 17:27:05.577796 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.577934 kubelet[2900]: E0123 17:27:05.577920 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.578770 kubelet[2900]: E0123 17:27:05.578654 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.578967 kubelet[2900]: W0123 17:27:05.578945 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.579242 kubelet[2900]: E0123 17:27:05.579079 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.580777 kubelet[2900]: E0123 17:27:05.580561 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.580777 kubelet[2900]: W0123 17:27:05.580581 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.580777 kubelet[2900]: E0123 17:27:05.580599 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.582185 kubelet[2900]: E0123 17:27:05.582151 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.582489 kubelet[2900]: W0123 17:27:05.582402 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.582489 kubelet[2900]: E0123 17:27:05.582429 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.582770 kubelet[2900]: E0123 17:27:05.582756 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.582833 kubelet[2900]: W0123 17:27:05.582822 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.582898 kubelet[2900]: E0123 17:27:05.582887 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.584661 kubelet[2900]: E0123 17:27:05.584498 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.584661 kubelet[2900]: W0123 17:27:05.584517 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.584661 kubelet[2900]: E0123 17:27:05.584530 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.585071 kubelet[2900]: E0123 17:27:05.585053 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.585365 kubelet[2900]: W0123 17:27:05.585344 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.586356 kubelet[2900]: E0123 17:27:05.586333 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.586769 kubelet[2900]: E0123 17:27:05.586664 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.586769 kubelet[2900]: W0123 17:27:05.586678 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.586769 kubelet[2900]: E0123 17:27:05.586689 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.586928 kubelet[2900]: E0123 17:27:05.586915 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.586982 kubelet[2900]: W0123 17:27:05.586971 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.587345 kubelet[2900]: E0123 17:27:05.587028 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.589655 kubelet[2900]: E0123 17:27:05.589559 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.589655 kubelet[2900]: W0123 17:27:05.589578 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.589655 kubelet[2900]: E0123 17:27:05.589590 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.590384 kubelet[2900]: E0123 17:27:05.590364 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.590384 kubelet[2900]: W0123 17:27:05.590380 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.590750 kubelet[2900]: E0123 17:27:05.590393 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.590750 kubelet[2900]: E0123 17:27:05.590587 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.590750 kubelet[2900]: W0123 17:27:05.590597 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.590750 kubelet[2900]: E0123 17:27:05.590605 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.591347 kubelet[2900]: E0123 17:27:05.590832 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.591347 kubelet[2900]: W0123 17:27:05.590849 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.591347 kubelet[2900]: E0123 17:27:05.590860 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.591851 kubelet[2900]: E0123 17:27:05.591829 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.591851 kubelet[2900]: W0123 17:27:05.591845 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.592469 kubelet[2900]: E0123 17:27:05.591858 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.592469 kubelet[2900]: I0123 17:27:05.591885 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baa2fc5a-f231-4d23-992e-37e27c865a7c-kubelet-dir\") pod \"csi-node-driver-p68bw\" (UID: \"baa2fc5a-f231-4d23-992e-37e27c865a7c\") " pod="calico-system/csi-node-driver-p68bw" Jan 23 17:27:05.592469 kubelet[2900]: E0123 17:27:05.592074 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.592469 kubelet[2900]: W0123 17:27:05.592083 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.592469 kubelet[2900]: E0123 17:27:05.592091 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.592469 kubelet[2900]: I0123 17:27:05.592112 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/baa2fc5a-f231-4d23-992e-37e27c865a7c-registration-dir\") pod \"csi-node-driver-p68bw\" (UID: \"baa2fc5a-f231-4d23-992e-37e27c865a7c\") " pod="calico-system/csi-node-driver-p68bw" Jan 23 17:27:05.593188 kubelet[2900]: E0123 17:27:05.592950 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.593188 kubelet[2900]: W0123 17:27:05.593083 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.593401 kubelet[2900]: E0123 17:27:05.593232 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.594098 kubelet[2900]: E0123 17:27:05.594037 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.594098 kubelet[2900]: W0123 17:27:05.594053 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.594098 kubelet[2900]: E0123 17:27:05.594069 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.594516 kubelet[2900]: E0123 17:27:05.594388 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.594516 kubelet[2900]: W0123 17:27:05.594404 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.594516 kubelet[2900]: E0123 17:27:05.594415 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.594516 kubelet[2900]: I0123 17:27:05.594465 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/baa2fc5a-f231-4d23-992e-37e27c865a7c-socket-dir\") pod \"csi-node-driver-p68bw\" (UID: \"baa2fc5a-f231-4d23-992e-37e27c865a7c\") " pod="calico-system/csi-node-driver-p68bw" Jan 23 17:27:05.594680 kubelet[2900]: E0123 17:27:05.594626 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.594680 kubelet[2900]: W0123 17:27:05.594635 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.594680 kubelet[2900]: E0123 17:27:05.594643 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.595836 kubelet[2900]: E0123 17:27:05.594772 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.595836 kubelet[2900]: W0123 17:27:05.594782 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.595836 kubelet[2900]: E0123 17:27:05.594788 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.595836 kubelet[2900]: E0123 17:27:05.595020 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.595836 kubelet[2900]: W0123 17:27:05.595029 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.595836 kubelet[2900]: E0123 17:27:05.595037 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.595836 kubelet[2900]: I0123 17:27:05.595058 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/baa2fc5a-f231-4d23-992e-37e27c865a7c-varrun\") pod \"csi-node-driver-p68bw\" (UID: \"baa2fc5a-f231-4d23-992e-37e27c865a7c\") " pod="calico-system/csi-node-driver-p68bw" Jan 23 17:27:05.596127 containerd[1675]: time="2026-01-23T17:27:05.594996945Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5776bd4fb5-gnfzm,Uid:4562a804-91ca-4b9e-962b-e8ef9fd6e802,Namespace:calico-system,Attempt:0,} returns sandbox id \"8dadaaadcade86e4b4307225bee26a7e1928c4d934c60ccc01b3fa9204aaddc7\"" Jan 23 17:27:05.596190 kubelet[2900]: E0123 17:27:05.595866 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.596190 kubelet[2900]: W0123 17:27:05.595885 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.596190 kubelet[2900]: E0123 17:27:05.595900 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.596606 kubelet[2900]: E0123 17:27:05.596565 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.596606 kubelet[2900]: W0123 17:27:05.596589 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.596606 kubelet[2900]: E0123 17:27:05.596605 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.597822 kubelet[2900]: E0123 17:27:05.597300 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.597822 kubelet[2900]: W0123 17:27:05.597335 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.597822 kubelet[2900]: E0123 17:27:05.597348 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.597822 kubelet[2900]: I0123 17:27:05.597414 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tpb9\" (UniqueName: \"kubernetes.io/projected/baa2fc5a-f231-4d23-992e-37e27c865a7c-kube-api-access-6tpb9\") pod \"csi-node-driver-p68bw\" (UID: \"baa2fc5a-f231-4d23-992e-37e27c865a7c\") " pod="calico-system/csi-node-driver-p68bw" Jan 23 17:27:05.598343 containerd[1675]: time="2026-01-23T17:27:05.598295921Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 23 17:27:05.598416 kubelet[2900]: E0123 17:27:05.598364 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.598416 kubelet[2900]: W0123 17:27:05.598381 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.598416 kubelet[2900]: E0123 17:27:05.598396 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.599253 kubelet[2900]: E0123 17:27:05.599232 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.599745 kubelet[2900]: W0123 17:27:05.599347 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.599745 kubelet[2900]: E0123 17:27:05.599368 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.600258 kubelet[2900]: E0123 17:27:05.600172 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.600879 kubelet[2900]: W0123 17:27:05.600383 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.600879 kubelet[2900]: E0123 17:27:05.600406 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.601877 kubelet[2900]: E0123 17:27:05.601727 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.601877 kubelet[2900]: W0123 17:27:05.601745 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.601877 kubelet[2900]: E0123 17:27:05.601757 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.646968 containerd[1675]: time="2026-01-23T17:27:05.646752359Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-km4sx,Uid:e36568b0-f45f-48dc-aac1-0c1aa7db269a,Namespace:calico-system,Attempt:0,}" Jan 23 17:27:05.664842 containerd[1675]: time="2026-01-23T17:27:05.664793088Z" level=info msg="connecting to shim 062a64adfc7f0a7ed4d5582cbb439ddac258ae5247aef948bb2508dfb4d60af5" address="unix:///run/containerd/s/dfef6d6a2f2ba92efbc359d8eb258a049469366905ea88edc501cc5c196a5241" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:27:05.685520 systemd[1]: Started cri-containerd-062a64adfc7f0a7ed4d5582cbb439ddac258ae5247aef948bb2508dfb4d60af5.scope - libcontainer container 062a64adfc7f0a7ed4d5582cbb439ddac258ae5247aef948bb2508dfb4d60af5. Jan 23 17:27:05.694000 audit: BPF prog-id=156 op=LOAD Jan 23 17:27:05.695000 audit: BPF prog-id=157 op=LOAD Jan 23 17:27:05.695000 audit[3438]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3427 pid=3438 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.695000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036326136346164666337663061376564346435353832636262343339 Jan 23 17:27:05.695000 audit: BPF prog-id=157 op=UNLOAD Jan 23 17:27:05.695000 audit[3438]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3427 pid=3438 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.695000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036326136346164666337663061376564346435353832636262343339 Jan 23 17:27:05.695000 audit: BPF prog-id=158 op=LOAD Jan 23 17:27:05.695000 audit[3438]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3427 pid=3438 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.695000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036326136346164666337663061376564346435353832636262343339 Jan 23 17:27:05.695000 audit: BPF prog-id=159 op=LOAD Jan 23 17:27:05.695000 audit[3438]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3427 pid=3438 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.695000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036326136346164666337663061376564346435353832636262343339 Jan 23 17:27:05.695000 audit: BPF prog-id=159 op=UNLOAD Jan 23 17:27:05.695000 audit[3438]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3427 pid=3438 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 17:27:05.695000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036326136346164666337663061376564346435353832636262343339 Jan 23 17:27:05.695000 audit: BPF prog-id=158 op=UNLOAD Jan 23 17:27:05.695000 audit[3438]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3427 pid=3438 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.695000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036326136346164666337663061376564346435353832636262343339 Jan 23 17:27:05.695000 audit: BPF prog-id=160 op=LOAD Jan 23 17:27:05.695000 audit[3438]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3427 pid=3438 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:05.695000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3036326136346164666337663061376564346435353832636262343339 Jan 23 17:27:05.698344 kubelet[2900]: E0123 17:27:05.698320 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.698669 kubelet[2900]: W0123 17:27:05.698439 2900 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.698669 kubelet[2900]: E0123 17:27:05.698462 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.699234 kubelet[2900]: E0123 17:27:05.699202 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.699234 kubelet[2900]: W0123 17:27:05.699225 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.699300 kubelet[2900]: E0123 17:27:05.699239 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.699502 kubelet[2900]: E0123 17:27:05.699460 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.699502 kubelet[2900]: W0123 17:27:05.699474 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.699502 kubelet[2900]: E0123 17:27:05.699489 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.699805 kubelet[2900]: E0123 17:27:05.699778 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.699805 kubelet[2900]: W0123 17:27:05.699796 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.699805 kubelet[2900]: E0123 17:27:05.699808 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.700260 kubelet[2900]: E0123 17:27:05.699971 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.700260 kubelet[2900]: W0123 17:27:05.699979 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.700260 kubelet[2900]: E0123 17:27:05.699986 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.700260 kubelet[2900]: E0123 17:27:05.700251 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.700260 kubelet[2900]: W0123 17:27:05.700261 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.700483 kubelet[2900]: E0123 17:27:05.700271 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.700655 kubelet[2900]: E0123 17:27:05.700627 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.700701 kubelet[2900]: W0123 17:27:05.700684 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.700769 kubelet[2900]: E0123 17:27:05.700703 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.701421 kubelet[2900]: E0123 17:27:05.701402 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.701421 kubelet[2900]: W0123 17:27:05.701419 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.701490 kubelet[2900]: E0123 17:27:05.701432 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.701619 kubelet[2900]: E0123 17:27:05.701605 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.701656 kubelet[2900]: W0123 17:27:05.701617 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.701656 kubelet[2900]: E0123 17:27:05.701640 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.701846 kubelet[2900]: E0123 17:27:05.701831 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.701846 kubelet[2900]: W0123 17:27:05.701845 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.701916 kubelet[2900]: E0123 17:27:05.701855 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.702247 kubelet[2900]: E0123 17:27:05.702094 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.702424 kubelet[2900]: W0123 17:27:05.702250 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.702424 kubelet[2900]: E0123 17:27:05.702264 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.702530 kubelet[2900]: E0123 17:27:05.702514 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.702530 kubelet[2900]: W0123 17:27:05.702528 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.702842 kubelet[2900]: E0123 17:27:05.702539 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.702842 kubelet[2900]: E0123 17:27:05.702745 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.702842 kubelet[2900]: W0123 17:27:05.702759 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.702842 kubelet[2900]: E0123 17:27:05.702770 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.703194 kubelet[2900]: E0123 17:27:05.703165 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.703194 kubelet[2900]: W0123 17:27:05.703186 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.703248 kubelet[2900]: E0123 17:27:05.703199 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.703560 kubelet[2900]: E0123 17:27:05.703542 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.703560 kubelet[2900]: W0123 17:27:05.703560 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.703639 kubelet[2900]: E0123 17:27:05.703572 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.703791 kubelet[2900]: E0123 17:27:05.703763 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.703791 kubelet[2900]: W0123 17:27:05.703778 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.703791 kubelet[2900]: E0123 17:27:05.703788 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.704065 kubelet[2900]: E0123 17:27:05.704049 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.704065 kubelet[2900]: W0123 17:27:05.704064 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.704173 kubelet[2900]: E0123 17:27:05.704074 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.704259 kubelet[2900]: E0123 17:27:05.704243 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.704259 kubelet[2900]: W0123 17:27:05.704256 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.704381 kubelet[2900]: E0123 17:27:05.704265 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.704475 kubelet[2900]: E0123 17:27:05.704458 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.704475 kubelet[2900]: W0123 17:27:05.704472 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.704539 kubelet[2900]: E0123 17:27:05.704481 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.704704 kubelet[2900]: E0123 17:27:05.704689 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.704704 kubelet[2900]: W0123 17:27:05.704702 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.704761 kubelet[2900]: E0123 17:27:05.704712 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.704875 kubelet[2900]: E0123 17:27:05.704860 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.704875 kubelet[2900]: W0123 17:27:05.704872 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.705065 kubelet[2900]: E0123 17:27:05.704881 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.705156 kubelet[2900]: E0123 17:27:05.705139 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.705206 kubelet[2900]: W0123 17:27:05.705195 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.705258 kubelet[2900]: E0123 17:27:05.705246 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.705470 kubelet[2900]: E0123 17:27:05.705455 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.705662 kubelet[2900]: W0123 17:27:05.705521 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.705662 kubelet[2900]: E0123 17:27:05.705540 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.705811 kubelet[2900]: E0123 17:27:05.705797 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.705864 kubelet[2900]: W0123 17:27:05.705853 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.705912 kubelet[2900]: E0123 17:27:05.705901 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:05.706254 kubelet[2900]: E0123 17:27:05.706225 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.706254 kubelet[2900]: W0123 17:27:05.706244 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.706254 kubelet[2900]: E0123 17:27:05.706257 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:05.715295 containerd[1675]: time="2026-01-23T17:27:05.715255695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-km4sx,Uid:e36568b0-f45f-48dc-aac1-0c1aa7db269a,Namespace:calico-system,Attempt:0,} returns sandbox id \"062a64adfc7f0a7ed4d5582cbb439ddac258ae5247aef948bb2508dfb4d60af5\"" Jan 23 17:27:05.718400 kubelet[2900]: E0123 17:27:05.718373 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:05.718400 kubelet[2900]: W0123 17:27:05.718396 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:05.718914 kubelet[2900]: E0123 17:27:05.718891 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:06.136000 audit[3492]: NETFILTER_CFG table=filter:115 family=2 entries=22 op=nft_register_rule pid=3492 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:06.136000 audit[3492]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffeb4a5c30 a2=0 a3=1 items=0 ppid=3013 pid=3492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:06.136000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:06.148000 audit[3492]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3492 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:06.148000 audit[3492]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffeb4a5c30 a2=0 a3=1 
items=0 ppid=3013 pid=3492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:06.148000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:07.319907 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2722890853.mount: Deactivated successfully. Jan 23 17:27:07.560931 kubelet[2900]: E0123 17:27:07.560820 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:27:08.418437 containerd[1675]: time="2026-01-23T17:27:08.418364955Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:27:08.419351 containerd[1675]: time="2026-01-23T17:27:08.419289160Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31719231" Jan 23 17:27:08.420418 containerd[1675]: time="2026-01-23T17:27:08.420374565Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:27:08.422689 containerd[1675]: time="2026-01-23T17:27:08.422630456Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:27:08.423287 containerd[1675]: time="2026-01-23T17:27:08.423186979Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id 
\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.824682017s" Jan 23 17:27:08.423287 containerd[1675]: time="2026-01-23T17:27:08.423220459Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Jan 23 17:27:08.424467 containerd[1675]: time="2026-01-23T17:27:08.424431385Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 23 17:27:08.434341 containerd[1675]: time="2026-01-23T17:27:08.434271073Z" level=info msg="CreateContainer within sandbox \"8dadaaadcade86e4b4307225bee26a7e1928c4d934c60ccc01b3fa9204aaddc7\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 23 17:27:08.443387 containerd[1675]: time="2026-01-23T17:27:08.443333878Z" level=info msg="Container 2e12fe4417435f343ee63d65810d7d00ff51f766a5f76ed0d7cb0828298a40b3: CDI devices from CRI Config.CDIDevices: []" Jan 23 17:27:08.449810 containerd[1675]: time="2026-01-23T17:27:08.449760149Z" level=info msg="CreateContainer within sandbox \"8dadaaadcade86e4b4307225bee26a7e1928c4d934c60ccc01b3fa9204aaddc7\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"2e12fe4417435f343ee63d65810d7d00ff51f766a5f76ed0d7cb0828298a40b3\"" Jan 23 17:27:08.450589 containerd[1675]: time="2026-01-23T17:27:08.450563233Z" level=info msg="StartContainer for \"2e12fe4417435f343ee63d65810d7d00ff51f766a5f76ed0d7cb0828298a40b3\"" Jan 23 17:27:08.451678 containerd[1675]: time="2026-01-23T17:27:08.451652559Z" level=info msg="connecting to shim 2e12fe4417435f343ee63d65810d7d00ff51f766a5f76ed0d7cb0828298a40b3" address="unix:///run/containerd/s/452a936a08e778997a0b6f3e2eb077a100d18f032483bbd551b397d33316f5c5" protocol=ttrpc version=3 Jan 23 
17:27:08.470522 systemd[1]: Started cri-containerd-2e12fe4417435f343ee63d65810d7d00ff51f766a5f76ed0d7cb0828298a40b3.scope - libcontainer container 2e12fe4417435f343ee63d65810d7d00ff51f766a5f76ed0d7cb0828298a40b3. Jan 23 17:27:08.481000 audit: BPF prog-id=161 op=LOAD Jan 23 17:27:08.482732 kernel: kauditd_printk_skb: 58 callbacks suppressed Jan 23 17:27:08.482800 kernel: audit: type=1334 audit(1769189228.481:549): prog-id=161 op=LOAD Jan 23 17:27:08.482000 audit: BPF prog-id=162 op=LOAD Jan 23 17:27:08.483837 kernel: audit: type=1334 audit(1769189228.482:550): prog-id=162 op=LOAD Jan 23 17:27:08.483872 kernel: audit: type=1300 audit(1769189228.482:550): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3333 pid=3504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:08.482000 audit[3504]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3333 pid=3504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:08.482000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313266653434313734333566333433656536336436353831306437 Jan 23 17:27:08.489992 kernel: audit: type=1327 audit(1769189228.482:550): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313266653434313734333566333433656536336436353831306437 Jan 23 17:27:08.490100 kernel: audit: type=1334 audit(1769189228.482:551): prog-id=162 
op=UNLOAD Jan 23 17:27:08.482000 audit: BPF prog-id=162 op=UNLOAD Jan 23 17:27:08.490346 kernel: audit: type=1300 audit(1769189228.482:551): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3333 pid=3504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:08.482000 audit[3504]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3333 pid=3504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:08.493110 kernel: audit: type=1327 audit(1769189228.482:551): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313266653434313734333566333433656536336436353831306437 Jan 23 17:27:08.482000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313266653434313734333566333433656536336436353831306437 Jan 23 17:27:08.495948 kernel: audit: type=1334 audit(1769189228.484:552): prog-id=163 op=LOAD Jan 23 17:27:08.484000 audit: BPF prog-id=163 op=LOAD Jan 23 17:27:08.484000 audit[3504]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3333 pid=3504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:08.499627 kernel: audit: type=1300 audit(1769189228.484:552): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 
items=0 ppid=3333 pid=3504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:08.484000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313266653434313734333566333433656536336436353831306437 Jan 23 17:27:08.502554 kernel: audit: type=1327 audit(1769189228.484:552): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313266653434313734333566333433656536336436353831306437 Jan 23 17:27:08.486000 audit: BPF prog-id=164 op=LOAD Jan 23 17:27:08.486000 audit[3504]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3333 pid=3504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:08.486000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313266653434313734333566333433656536336436353831306437 Jan 23 17:27:08.490000 audit: BPF prog-id=164 op=UNLOAD Jan 23 17:27:08.490000 audit[3504]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3333 pid=3504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:08.490000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313266653434313734333566333433656536336436353831306437 Jan 23 17:27:08.490000 audit: BPF prog-id=163 op=UNLOAD Jan 23 17:27:08.490000 audit[3504]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3333 pid=3504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:08.490000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313266653434313734333566333433656536336436353831306437 Jan 23 17:27:08.490000 audit: BPF prog-id=165 op=LOAD Jan 23 17:27:08.490000 audit[3504]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3333 pid=3504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:08.490000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313266653434313734333566333433656536336436353831306437 Jan 23 17:27:08.526066 containerd[1675]: time="2026-01-23T17:27:08.526029163Z" level=info msg="StartContainer for \"2e12fe4417435f343ee63d65810d7d00ff51f766a5f76ed0d7cb0828298a40b3\" returns successfully" Jan 23 17:27:08.713247 kubelet[2900]: E0123 17:27:08.712429 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of 
JSON input Jan 23 17:27:08.713247 kubelet[2900]: W0123 17:27:08.712455 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.713247 kubelet[2900]: E0123 17:27:08.712476 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.714397 kubelet[2900]: E0123 17:27:08.713447 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.714397 kubelet[2900]: W0123 17:27:08.713486 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.714397 kubelet[2900]: E0123 17:27:08.713531 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.714397 kubelet[2900]: E0123 17:27:08.714361 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.714397 kubelet[2900]: W0123 17:27:08.714377 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.714397 kubelet[2900]: E0123 17:27:08.714390 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.715339 kubelet[2900]: E0123 17:27:08.715318 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.715339 kubelet[2900]: W0123 17:27:08.715335 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.715442 kubelet[2900]: E0123 17:27:08.715347 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.715563 kubelet[2900]: E0123 17:27:08.715545 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.715563 kubelet[2900]: W0123 17:27:08.715558 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.715621 kubelet[2900]: E0123 17:27:08.715566 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.716427 kubelet[2900]: E0123 17:27:08.716401 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.716427 kubelet[2900]: W0123 17:27:08.716418 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.716427 kubelet[2900]: E0123 17:27:08.716431 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.717478 kubelet[2900]: E0123 17:27:08.717456 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.717478 kubelet[2900]: W0123 17:27:08.717472 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.717582 kubelet[2900]: E0123 17:27:08.717484 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.717701 kubelet[2900]: E0123 17:27:08.717682 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.717701 kubelet[2900]: W0123 17:27:08.717695 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.717757 kubelet[2900]: E0123 17:27:08.717704 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.717937 kubelet[2900]: E0123 17:27:08.717856 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.717937 kubelet[2900]: W0123 17:27:08.717866 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.717937 kubelet[2900]: E0123 17:27:08.717874 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.718072 kubelet[2900]: E0123 17:27:08.718035 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.718072 kubelet[2900]: W0123 17:27:08.718047 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.718072 kubelet[2900]: E0123 17:27:08.718056 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.719462 kubelet[2900]: E0123 17:27:08.719439 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.719462 kubelet[2900]: W0123 17:27:08.719455 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.719462 kubelet[2900]: E0123 17:27:08.719466 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.719675 kubelet[2900]: E0123 17:27:08.719653 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.719675 kubelet[2900]: W0123 17:27:08.719673 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.719745 kubelet[2900]: E0123 17:27:08.719682 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.719843 kubelet[2900]: E0123 17:27:08.719823 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.719843 kubelet[2900]: W0123 17:27:08.719836 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.719843 kubelet[2900]: E0123 17:27:08.719843 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.719978 kubelet[2900]: E0123 17:27:08.719963 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.719978 kubelet[2900]: W0123 17:27:08.719974 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.720044 kubelet[2900]: E0123 17:27:08.719983 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.720118 kubelet[2900]: E0123 17:27:08.720100 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.720118 kubelet[2900]: W0123 17:27:08.720112 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.720172 kubelet[2900]: E0123 17:27:08.720123 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.726917 kubelet[2900]: E0123 17:27:08.726885 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.726917 kubelet[2900]: W0123 17:27:08.726908 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.726917 kubelet[2900]: E0123 17:27:08.726926 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.727618 kubelet[2900]: E0123 17:27:08.727162 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.727618 kubelet[2900]: W0123 17:27:08.727173 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.727618 kubelet[2900]: E0123 17:27:08.727183 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.727966 kubelet[2900]: E0123 17:27:08.727846 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.727966 kubelet[2900]: W0123 17:27:08.727865 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.727966 kubelet[2900]: E0123 17:27:08.727877 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.728124 kubelet[2900]: E0123 17:27:08.728112 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.728189 kubelet[2900]: W0123 17:27:08.728178 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.728243 kubelet[2900]: E0123 17:27:08.728233 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.728574 kubelet[2900]: E0123 17:27:08.728475 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.728574 kubelet[2900]: W0123 17:27:08.728487 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.728574 kubelet[2900]: E0123 17:27:08.728496 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.728743 kubelet[2900]: E0123 17:27:08.728729 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.728803 kubelet[2900]: W0123 17:27:08.728792 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.728937 kubelet[2900]: E0123 17:27:08.728846 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.729536 kubelet[2900]: E0123 17:27:08.729215 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.729536 kubelet[2900]: W0123 17:27:08.729234 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.729536 kubelet[2900]: E0123 17:27:08.729248 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.730326 kubelet[2900]: E0123 17:27:08.730012 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.730326 kubelet[2900]: W0123 17:27:08.730028 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.730326 kubelet[2900]: E0123 17:27:08.730043 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.731501 kubelet[2900]: E0123 17:27:08.731430 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.731501 kubelet[2900]: W0123 17:27:08.731454 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.731501 kubelet[2900]: E0123 17:27:08.731471 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.731883 kubelet[2900]: E0123 17:27:08.731803 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.731883 kubelet[2900]: W0123 17:27:08.731819 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.731883 kubelet[2900]: E0123 17:27:08.731834 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.732079 kubelet[2900]: E0123 17:27:08.732056 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.732079 kubelet[2900]: W0123 17:27:08.732073 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.732148 kubelet[2900]: E0123 17:27:08.732082 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.732416 kubelet[2900]: E0123 17:27:08.732330 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.732416 kubelet[2900]: W0123 17:27:08.732345 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.732416 kubelet[2900]: E0123 17:27:08.732356 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.732915 kubelet[2900]: E0123 17:27:08.732559 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.732915 kubelet[2900]: W0123 17:27:08.732572 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.732915 kubelet[2900]: E0123 17:27:08.732580 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.732915 kubelet[2900]: E0123 17:27:08.732776 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.732915 kubelet[2900]: W0123 17:27:08.732786 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.732915 kubelet[2900]: E0123 17:27:08.732795 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.733551 kubelet[2900]: E0123 17:27:08.733528 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.733551 kubelet[2900]: W0123 17:27:08.733544 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.733551 kubelet[2900]: E0123 17:27:08.733555 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.733907 kubelet[2900]: E0123 17:27:08.733883 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.733907 kubelet[2900]: W0123 17:27:08.733899 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.733907 kubelet[2900]: E0123 17:27:08.733908 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:08.734652 kubelet[2900]: E0123 17:27:08.734618 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.734652 kubelet[2900]: W0123 17:27:08.734634 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.734652 kubelet[2900]: E0123 17:27:08.734647 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:08.735283 kubelet[2900]: E0123 17:27:08.735261 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:08.735283 kubelet[2900]: W0123 17:27:08.735279 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:08.735388 kubelet[2900]: E0123 17:27:08.735291 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.560649 kubelet[2900]: E0123 17:27:09.560577 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:27:09.642759 kubelet[2900]: I0123 17:27:09.642727 2900 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 23 17:27:09.724976 kubelet[2900]: E0123 17:27:09.724944 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.724976 kubelet[2900]: W0123 17:27:09.724968 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.724976 kubelet[2900]: E0123 17:27:09.724987 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.725378 kubelet[2900]: E0123 17:27:09.725165 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.725378 kubelet[2900]: W0123 17:27:09.725171 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.725378 kubelet[2900]: E0123 17:27:09.725179 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.725378 kubelet[2900]: E0123 17:27:09.725319 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.725378 kubelet[2900]: W0123 17:27:09.725327 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.725378 kubelet[2900]: E0123 17:27:09.725334 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.725520 kubelet[2900]: E0123 17:27:09.725496 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.725520 kubelet[2900]: W0123 17:27:09.725515 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.725583 kubelet[2900]: E0123 17:27:09.725524 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.725700 kubelet[2900]: E0123 17:27:09.725684 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.725700 kubelet[2900]: W0123 17:27:09.725693 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.725700 kubelet[2900]: E0123 17:27:09.725700 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.725847 kubelet[2900]: E0123 17:27:09.725836 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.725847 kubelet[2900]: W0123 17:27:09.725846 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.725902 kubelet[2900]: E0123 17:27:09.725853 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.725976 kubelet[2900]: E0123 17:27:09.725967 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.725976 kubelet[2900]: W0123 17:27:09.725975 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.726040 kubelet[2900]: E0123 17:27:09.725983 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.726106 kubelet[2900]: E0123 17:27:09.726097 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.726106 kubelet[2900]: W0123 17:27:09.726106 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.726158 kubelet[2900]: E0123 17:27:09.726113 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.726251 kubelet[2900]: E0123 17:27:09.726241 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.726251 kubelet[2900]: W0123 17:27:09.726250 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.726316 kubelet[2900]: E0123 17:27:09.726258 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.726400 kubelet[2900]: E0123 17:27:09.726390 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.726400 kubelet[2900]: W0123 17:27:09.726400 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.726459 kubelet[2900]: E0123 17:27:09.726407 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.726540 kubelet[2900]: E0123 17:27:09.726530 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.726540 kubelet[2900]: W0123 17:27:09.726539 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.726592 kubelet[2900]: E0123 17:27:09.726547 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.726703 kubelet[2900]: E0123 17:27:09.726692 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.726730 kubelet[2900]: W0123 17:27:09.726703 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.726730 kubelet[2900]: E0123 17:27:09.726711 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.726868 kubelet[2900]: E0123 17:27:09.726851 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.726868 kubelet[2900]: W0123 17:27:09.726868 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.726928 kubelet[2900]: E0123 17:27:09.726876 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.727006 kubelet[2900]: E0123 17:27:09.726995 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.727006 kubelet[2900]: W0123 17:27:09.727006 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.727058 kubelet[2900]: E0123 17:27:09.727014 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.727144 kubelet[2900]: E0123 17:27:09.727135 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.727144 kubelet[2900]: W0123 17:27:09.727144 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.727196 kubelet[2900]: E0123 17:27:09.727151 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.734605 kubelet[2900]: E0123 17:27:09.734538 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.734605 kubelet[2900]: W0123 17:27:09.734556 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.734605 kubelet[2900]: E0123 17:27:09.734567 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.734769 kubelet[2900]: E0123 17:27:09.734755 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.734769 kubelet[2900]: W0123 17:27:09.734765 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.734853 kubelet[2900]: E0123 17:27:09.734772 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.734944 kubelet[2900]: E0123 17:27:09.734933 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.734944 kubelet[2900]: W0123 17:27:09.734943 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.735001 kubelet[2900]: E0123 17:27:09.734950 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.735157 kubelet[2900]: E0123 17:27:09.735136 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.735157 kubelet[2900]: W0123 17:27:09.735154 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.735220 kubelet[2900]: E0123 17:27:09.735164 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.735349 kubelet[2900]: E0123 17:27:09.735339 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.735349 kubelet[2900]: W0123 17:27:09.735349 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.735420 kubelet[2900]: E0123 17:27:09.735356 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.735508 kubelet[2900]: E0123 17:27:09.735496 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.735508 kubelet[2900]: W0123 17:27:09.735505 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.735593 kubelet[2900]: E0123 17:27:09.735513 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.735688 kubelet[2900]: E0123 17:27:09.735674 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.735688 kubelet[2900]: W0123 17:27:09.735687 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.735737 kubelet[2900]: E0123 17:27:09.735695 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.735948 kubelet[2900]: E0123 17:27:09.735911 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.735948 kubelet[2900]: W0123 17:27:09.735924 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.735948 kubelet[2900]: E0123 17:27:09.735932 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.736402 kubelet[2900]: E0123 17:27:09.736373 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.736402 kubelet[2900]: W0123 17:27:09.736388 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.736402 kubelet[2900]: E0123 17:27:09.736398 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.736557 kubelet[2900]: E0123 17:27:09.736545 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.736557 kubelet[2900]: W0123 17:27:09.736554 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.736618 kubelet[2900]: E0123 17:27:09.736562 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.736732 kubelet[2900]: E0123 17:27:09.736714 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.736732 kubelet[2900]: W0123 17:27:09.736723 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.736732 kubelet[2900]: E0123 17:27:09.736730 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.736869 kubelet[2900]: E0123 17:27:09.736858 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.736869 kubelet[2900]: W0123 17:27:09.736867 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.736949 kubelet[2900]: E0123 17:27:09.736873 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.737083 kubelet[2900]: E0123 17:27:09.737069 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.737083 kubelet[2900]: W0123 17:27:09.737081 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.737144 kubelet[2900]: E0123 17:27:09.737088 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.737416 kubelet[2900]: E0123 17:27:09.737402 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.737416 kubelet[2900]: W0123 17:27:09.737414 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.737474 kubelet[2900]: E0123 17:27:09.737423 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.737599 kubelet[2900]: E0123 17:27:09.737586 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.737599 kubelet[2900]: W0123 17:27:09.737598 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.737645 kubelet[2900]: E0123 17:27:09.737606 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.737783 kubelet[2900]: E0123 17:27:09.737772 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.737783 kubelet[2900]: W0123 17:27:09.737782 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.737833 kubelet[2900]: E0123 17:27:09.737790 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.738052 kubelet[2900]: E0123 17:27:09.738039 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.738052 kubelet[2900]: W0123 17:27:09.738050 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.738123 kubelet[2900]: E0123 17:27:09.738058 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 23 17:27:09.738258 kubelet[2900]: E0123 17:27:09.738245 2900 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 23 17:27:09.738258 kubelet[2900]: W0123 17:27:09.738256 2900 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 23 17:27:09.738323 kubelet[2900]: E0123 17:27:09.738264 2900 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 23 17:27:09.912417 containerd[1675]: time="2026-01-23T17:27:09.912325004Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:27:09.913888 containerd[1675]: time="2026-01-23T17:27:09.913676971Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:09.914927 containerd[1675]: time="2026-01-23T17:27:09.914883216Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:27:09.917378 containerd[1675]: time="2026-01-23T17:27:09.917348989Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:27:09.918316 containerd[1675]: time="2026-01-23T17:27:09.918277673Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.493806408s" Jan 23 17:27:09.918397 containerd[1675]: time="2026-01-23T17:27:09.918328193Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Jan 23 17:27:09.922914 containerd[1675]: time="2026-01-23T17:27:09.922873576Z" level=info msg="CreateContainer within sandbox \"062a64adfc7f0a7ed4d5582cbb439ddac258ae5247aef948bb2508dfb4d60af5\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 23 17:27:09.935634 containerd[1675]: time="2026-01-23T17:27:09.935546478Z" level=info msg="Container 076585cb4d107f7d6b5f6b24632ad8bf315dc1ebe41987573fcdc90c52b6fda2: CDI devices from CRI Config.CDIDevices: []" Jan 23 17:27:09.948004 containerd[1675]: time="2026-01-23T17:27:09.947953059Z" level=info msg="CreateContainer within sandbox \"062a64adfc7f0a7ed4d5582cbb439ddac258ae5247aef948bb2508dfb4d60af5\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"076585cb4d107f7d6b5f6b24632ad8bf315dc1ebe41987573fcdc90c52b6fda2\"" Jan 23 17:27:09.948572 containerd[1675]: time="2026-01-23T17:27:09.948539022Z" level=info msg="StartContainer for \"076585cb4d107f7d6b5f6b24632ad8bf315dc1ebe41987573fcdc90c52b6fda2\"" Jan 23 17:27:09.950055 containerd[1675]: time="2026-01-23T17:27:09.950027429Z" level=info msg="connecting to shim 076585cb4d107f7d6b5f6b24632ad8bf315dc1ebe41987573fcdc90c52b6fda2" address="unix:///run/containerd/s/dfef6d6a2f2ba92efbc359d8eb258a049469366905ea88edc501cc5c196a5241" protocol=ttrpc version=3 Jan 23 17:27:09.969493 systemd[1]: Started cri-containerd-076585cb4d107f7d6b5f6b24632ad8bf315dc1ebe41987573fcdc90c52b6fda2.scope - libcontainer container 
076585cb4d107f7d6b5f6b24632ad8bf315dc1ebe41987573fcdc90c52b6fda2. Jan 23 17:27:10.029000 audit: BPF prog-id=166 op=LOAD Jan 23 17:27:10.029000 audit[3617]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3427 pid=3617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:10.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3037363538356362346431303766376436623566366232343633326164 Jan 23 17:27:10.029000 audit: BPF prog-id=167 op=LOAD Jan 23 17:27:10.029000 audit[3617]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3427 pid=3617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:10.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3037363538356362346431303766376436623566366232343633326164 Jan 23 17:27:10.029000 audit: BPF prog-id=167 op=UNLOAD Jan 23 17:27:10.029000 audit[3617]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3427 pid=3617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:10.029000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3037363538356362346431303766376436623566366232343633326164 Jan 23 17:27:10.029000 audit: BPF prog-id=166 op=UNLOAD Jan 23 17:27:10.029000 audit[3617]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3427 pid=3617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:10.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3037363538356362346431303766376436623566366232343633326164 Jan 23 17:27:10.029000 audit: BPF prog-id=168 op=LOAD Jan 23 17:27:10.029000 audit[3617]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3427 pid=3617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:10.029000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3037363538356362346431303766376436623566366232343633326164 Jan 23 17:27:10.045993 containerd[1675]: time="2026-01-23T17:27:10.045956059Z" level=info msg="StartContainer for \"076585cb4d107f7d6b5f6b24632ad8bf315dc1ebe41987573fcdc90c52b6fda2\" returns successfully" Jan 23 17:27:10.061225 systemd[1]: cri-containerd-076585cb4d107f7d6b5f6b24632ad8bf315dc1ebe41987573fcdc90c52b6fda2.scope: Deactivated successfully. 
Jan 23 17:27:10.064194 containerd[1675]: time="2026-01-23T17:27:10.064157669Z" level=info msg="received container exit event container_id:\"076585cb4d107f7d6b5f6b24632ad8bf315dc1ebe41987573fcdc90c52b6fda2\" id:\"076585cb4d107f7d6b5f6b24632ad8bf315dc1ebe41987573fcdc90c52b6fda2\" pid:3629 exited_at:{seconds:1769189230 nanos:63736547}" Jan 23 17:27:10.068000 audit: BPF prog-id=168 op=UNLOAD Jan 23 17:27:10.087017 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-076585cb4d107f7d6b5f6b24632ad8bf315dc1ebe41987573fcdc90c52b6fda2-rootfs.mount: Deactivated successfully. Jan 23 17:27:10.647717 containerd[1675]: time="2026-01-23T17:27:10.647666971Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 23 17:27:10.663569 kubelet[2900]: I0123 17:27:10.663503 2900 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5776bd4fb5-gnfzm" podStartSLOduration=2.836699662 podStartE2EDuration="5.663486689s" podCreationTimestamp="2026-01-23 17:27:05 +0000 UTC" firstStartedPulling="2026-01-23 17:27:05.597444677 +0000 UTC m=+23.132988369" lastFinishedPulling="2026-01-23 17:27:08.424231704 +0000 UTC m=+25.959775396" observedRunningTime="2026-01-23 17:27:08.664126561 +0000 UTC m=+26.199670253" watchObservedRunningTime="2026-01-23 17:27:10.663486689 +0000 UTC m=+28.199030381" Jan 23 17:27:11.560683 kubelet[2900]: E0123 17:27:11.560599 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:27:13.560410 kubelet[2900]: E0123 17:27:13.560364 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni 
plugin not initialized" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:27:13.952156 containerd[1675]: time="2026-01-23T17:27:13.952105181Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:27:13.953557 containerd[1675]: time="2026-01-23T17:27:13.953505028Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Jan 23 17:27:13.955346 containerd[1675]: time="2026-01-23T17:27:13.955032556Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:27:13.957781 containerd[1675]: time="2026-01-23T17:27:13.957741849Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:27:13.958513 containerd[1675]: time="2026-01-23T17:27:13.958477932Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 3.310770521s" Jan 23 17:27:13.958513 containerd[1675]: time="2026-01-23T17:27:13.958523933Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Jan 23 17:27:13.962832 containerd[1675]: time="2026-01-23T17:27:13.962733033Z" level=info msg="CreateContainer within sandbox \"062a64adfc7f0a7ed4d5582cbb439ddac258ae5247aef948bb2508dfb4d60af5\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 
23 17:27:13.972613 containerd[1675]: time="2026-01-23T17:27:13.971455196Z" level=info msg="Container a9d7ad53158d8e37c1fccb18e769c10fd44908f5c6c8dbcb19077627326d6e90: CDI devices from CRI Config.CDIDevices: []" Jan 23 17:27:13.980654 containerd[1675]: time="2026-01-23T17:27:13.980619121Z" level=info msg="CreateContainer within sandbox \"062a64adfc7f0a7ed4d5582cbb439ddac258ae5247aef948bb2508dfb4d60af5\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a9d7ad53158d8e37c1fccb18e769c10fd44908f5c6c8dbcb19077627326d6e90\"" Jan 23 17:27:13.981326 containerd[1675]: time="2026-01-23T17:27:13.981287004Z" level=info msg="StartContainer for \"a9d7ad53158d8e37c1fccb18e769c10fd44908f5c6c8dbcb19077627326d6e90\"" Jan 23 17:27:13.983456 containerd[1675]: time="2026-01-23T17:27:13.983410935Z" level=info msg="connecting to shim a9d7ad53158d8e37c1fccb18e769c10fd44908f5c6c8dbcb19077627326d6e90" address="unix:///run/containerd/s/dfef6d6a2f2ba92efbc359d8eb258a049469366905ea88edc501cc5c196a5241" protocol=ttrpc version=3 Jan 23 17:27:14.002535 systemd[1]: Started cri-containerd-a9d7ad53158d8e37c1fccb18e769c10fd44908f5c6c8dbcb19077627326d6e90.scope - libcontainer container a9d7ad53158d8e37c1fccb18e769c10fd44908f5c6c8dbcb19077627326d6e90. 
Jan 23 17:27:14.085739 kernel: kauditd_printk_skb: 28 callbacks suppressed Jan 23 17:27:14.085843 kernel: audit: type=1334 audit(1769189234.084:563): prog-id=169 op=LOAD Jan 23 17:27:14.084000 audit: BPF prog-id=169 op=LOAD Jan 23 17:27:14.084000 audit[3676]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3427 pid=3676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:14.084000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139643761643533313538643865333763316663636231386537363963 Jan 23 17:27:14.092294 kernel: audit: type=1300 audit(1769189234.084:563): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3427 pid=3676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:14.092364 kernel: audit: type=1327 audit(1769189234.084:563): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139643761643533313538643865333763316663636231386537363963 Jan 23 17:27:14.085000 audit: BPF prog-id=170 op=LOAD Jan 23 17:27:14.093384 kernel: audit: type=1334 audit(1769189234.085:564): prog-id=170 op=LOAD Jan 23 17:27:14.085000 audit[3676]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3427 pid=3676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:14.096481 kernel: audit: type=1300 audit(1769189234.085:564): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3427 pid=3676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:14.085000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139643761643533313538643865333763316663636231386537363963 Jan 23 17:27:14.099586 kernel: audit: type=1327 audit(1769189234.085:564): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139643761643533313538643865333763316663636231386537363963 Jan 23 17:27:14.099914 kernel: audit: type=1334 audit(1769189234.085:565): prog-id=170 op=UNLOAD Jan 23 17:27:14.085000 audit: BPF prog-id=170 op=UNLOAD Jan 23 17:27:14.085000 audit[3676]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3427 pid=3676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:14.103592 kernel: audit: type=1300 audit(1769189234.085:565): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3427 pid=3676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:14.103686 kernel: audit: type=1327 audit(1769189234.085:565): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139643761643533313538643865333763316663636231386537363963 Jan 23 17:27:14.085000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139643761643533313538643865333763316663636231386537363963 Jan 23 17:27:14.085000 audit: BPF prog-id=169 op=UNLOAD Jan 23 17:27:14.107350 kernel: audit: type=1334 audit(1769189234.085:566): prog-id=169 op=UNLOAD Jan 23 17:27:14.085000 audit[3676]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3427 pid=3676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:14.085000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139643761643533313538643865333763316663636231386537363963 Jan 23 17:27:14.085000 audit: BPF prog-id=171 op=LOAD Jan 23 17:27:14.085000 audit[3676]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3427 pid=3676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:14.085000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139643761643533313538643865333763316663636231386537363963 Jan 23 17:27:14.121816 containerd[1675]: time="2026-01-23T17:27:14.121778374Z" level=info msg="StartContainer for \"a9d7ad53158d8e37c1fccb18e769c10fd44908f5c6c8dbcb19077627326d6e90\" returns successfully" Jan 23 17:27:14.515250 containerd[1675]: time="2026-01-23T17:27:14.515196583Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 23 17:27:14.516025 systemd[1]: cri-containerd-a9d7ad53158d8e37c1fccb18e769c10fd44908f5c6c8dbcb19077627326d6e90.scope: Deactivated successfully. Jan 23 17:27:14.517149 containerd[1675]: time="2026-01-23T17:27:14.517111393Z" level=info msg="received container exit event container_id:\"a9d7ad53158d8e37c1fccb18e769c10fd44908f5c6c8dbcb19077627326d6e90\" id:\"a9d7ad53158d8e37c1fccb18e769c10fd44908f5c6c8dbcb19077627326d6e90\" pid:3690 exited_at:{seconds:1769189234 nanos:516876472}" Jan 23 17:27:14.517471 systemd[1]: cri-containerd-a9d7ad53158d8e37c1fccb18e769c10fd44908f5c6c8dbcb19077627326d6e90.scope: Consumed 462ms CPU time, 193.8M memory peak, 165.9M written to disk. Jan 23 17:27:14.523000 audit: BPF prog-id=171 op=UNLOAD Jan 23 17:27:14.536806 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a9d7ad53158d8e37c1fccb18e769c10fd44908f5c6c8dbcb19077627326d6e90-rootfs.mount: Deactivated successfully. 
Jan 23 17:27:14.544769 kubelet[2900]: I0123 17:27:14.544721 2900 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Jan 23 17:27:14.596487 systemd[1]: Created slice kubepods-besteffort-podd8d63139_4f9a_445f_a5a9_c69b14220343.slice - libcontainer container kubepods-besteffort-podd8d63139_4f9a_445f_a5a9_c69b14220343.slice. Jan 23 17:27:14.612899 systemd[1]: Created slice kubepods-burstable-pod5f83fc99_aaad_4d50_822f_b091c1ee0475.slice - libcontainer container kubepods-burstable-pod5f83fc99_aaad_4d50_822f_b091c1ee0475.slice. Jan 23 17:27:14.626126 systemd[1]: Created slice kubepods-burstable-pod972bd28e_0f5b_4146_80fb_f01380586d1b.slice - libcontainer container kubepods-burstable-pod972bd28e_0f5b_4146_80fb_f01380586d1b.slice. Jan 23 17:27:14.633051 systemd[1]: Created slice kubepods-besteffort-podbad712f1_634f_4629_aa4d_a4a0636cb622.slice - libcontainer container kubepods-besteffort-podbad712f1_634f_4629_aa4d_a4a0636cb622.slice. Jan 23 17:27:14.638584 systemd[1]: Created slice kubepods-besteffort-pod4e4a5b57_4234_49d1_b775_45759cc5cd06.slice - libcontainer container kubepods-besteffort-pod4e4a5b57_4234_49d1_b775_45759cc5cd06.slice. Jan 23 17:27:14.645691 systemd[1]: Created slice kubepods-besteffort-podbdc6b38c_9de0_4613_a814_03b925209707.slice - libcontainer container kubepods-besteffort-podbdc6b38c_9de0_4613_a814_03b925209707.slice. Jan 23 17:27:14.653020 systemd[1]: Created slice kubepods-besteffort-pod7377c58d_7984_42ea_99c0_3f9fee9f0b42.slice - libcontainer container kubepods-besteffort-pod7377c58d_7984_42ea_99c0_3f9fee9f0b42.slice. 
Jan 23 17:27:14.662835 containerd[1675]: time="2026-01-23T17:27:14.662796188Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 23 17:27:14.666333 kubelet[2900]: I0123 17:27:14.666056 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frq95\" (UniqueName: \"kubernetes.io/projected/7377c58d-7984-42ea-99c0-3f9fee9f0b42-kube-api-access-frq95\") pod \"whisker-575b58b9b5-hb7t2\" (UID: \"7377c58d-7984-42ea-99c0-3f9fee9f0b42\") " pod="calico-system/whisker-575b58b9b5-hb7t2" Jan 23 17:27:14.667861 kubelet[2900]: I0123 17:27:14.667213 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsgcz\" (UniqueName: \"kubernetes.io/projected/d8d63139-4f9a-445f-a5a9-c69b14220343-kube-api-access-fsgcz\") pod \"calico-kube-controllers-5bcfdccd6d-ph9sw\" (UID: \"d8d63139-4f9a-445f-a5a9-c69b14220343\") " pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" Jan 23 17:27:14.667861 kubelet[2900]: I0123 17:27:14.667418 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e4a5b57-4234-49d1-b775-45759cc5cd06-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-ssq4v\" (UID: \"4e4a5b57-4234-49d1-b775-45759cc5cd06\") " pod="calico-system/goldmane-7c778bb748-ssq4v" Jan 23 17:27:14.667861 kubelet[2900]: I0123 17:27:14.667446 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7377c58d-7984-42ea-99c0-3f9fee9f0b42-whisker-backend-key-pair\") pod \"whisker-575b58b9b5-hb7t2\" (UID: \"7377c58d-7984-42ea-99c0-3f9fee9f0b42\") " pod="calico-system/whisker-575b58b9b5-hb7t2" Jan 23 17:27:14.667861 kubelet[2900]: I0123 17:27:14.667465 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bdm9q\" (UniqueName: \"kubernetes.io/projected/4e4a5b57-4234-49d1-b775-45759cc5cd06-kube-api-access-bdm9q\") pod \"goldmane-7c778bb748-ssq4v\" (UID: \"4e4a5b57-4234-49d1-b775-45759cc5cd06\") " pod="calico-system/goldmane-7c778bb748-ssq4v" Jan 23 17:27:14.667861 kubelet[2900]: I0123 17:27:14.667480 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8rtm\" (UniqueName: \"kubernetes.io/projected/bad712f1-634f-4629-aa4d-a4a0636cb622-kube-api-access-q8rtm\") pod \"calico-apiserver-dd7cc585d-gp992\" (UID: \"bad712f1-634f-4629-aa4d-a4a0636cb622\") " pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" Jan 23 17:27:14.668104 kubelet[2900]: I0123 17:27:14.667583 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljlww\" (UniqueName: \"kubernetes.io/projected/5f83fc99-aaad-4d50-822f-b091c1ee0475-kube-api-access-ljlww\") pod \"coredns-66bc5c9577-glh59\" (UID: \"5f83fc99-aaad-4d50-822f-b091c1ee0475\") " pod="kube-system/coredns-66bc5c9577-glh59" Jan 23 17:27:14.668104 kubelet[2900]: I0123 17:27:14.667601 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl45j\" (UniqueName: \"kubernetes.io/projected/972bd28e-0f5b-4146-80fb-f01380586d1b-kube-api-access-pl45j\") pod \"coredns-66bc5c9577-sgfzg\" (UID: \"972bd28e-0f5b-4146-80fb-f01380586d1b\") " pod="kube-system/coredns-66bc5c9577-sgfzg" Jan 23 17:27:14.668714 kubelet[2900]: I0123 17:27:14.668390 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7377c58d-7984-42ea-99c0-3f9fee9f0b42-whisker-ca-bundle\") pod \"whisker-575b58b9b5-hb7t2\" (UID: \"7377c58d-7984-42ea-99c0-3f9fee9f0b42\") " pod="calico-system/whisker-575b58b9b5-hb7t2" Jan 23 17:27:14.668714 kubelet[2900]: I0123 17:27:14.668441 2900 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/bdc6b38c-9de0-4613-a814-03b925209707-calico-apiserver-certs\") pod \"calico-apiserver-dd7cc585d-hrzqq\" (UID: \"bdc6b38c-9de0-4613-a814-03b925209707\") " pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" Jan 23 17:27:14.668714 kubelet[2900]: I0123 17:27:14.668458 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c2kv\" (UniqueName: \"kubernetes.io/projected/bdc6b38c-9de0-4613-a814-03b925209707-kube-api-access-6c2kv\") pod \"calico-apiserver-dd7cc585d-hrzqq\" (UID: \"bdc6b38c-9de0-4613-a814-03b925209707\") " pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" Jan 23 17:27:14.669316 kubelet[2900]: I0123 17:27:14.668746 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/4e4a5b57-4234-49d1-b775-45759cc5cd06-goldmane-key-pair\") pod \"goldmane-7c778bb748-ssq4v\" (UID: \"4e4a5b57-4234-49d1-b775-45759cc5cd06\") " pod="calico-system/goldmane-7c778bb748-ssq4v" Jan 23 17:27:14.669316 kubelet[2900]: I0123 17:27:14.668837 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/972bd28e-0f5b-4146-80fb-f01380586d1b-config-volume\") pod \"coredns-66bc5c9577-sgfzg\" (UID: \"972bd28e-0f5b-4146-80fb-f01380586d1b\") " pod="kube-system/coredns-66bc5c9577-sgfzg" Jan 23 17:27:14.669316 kubelet[2900]: I0123 17:27:14.669077 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e4a5b57-4234-49d1-b775-45759cc5cd06-config\") pod \"goldmane-7c778bb748-ssq4v\" (UID: \"4e4a5b57-4234-49d1-b775-45759cc5cd06\") " pod="calico-system/goldmane-7c778bb748-ssq4v" Jan 23 
17:27:14.669316 kubelet[2900]: I0123 17:27:14.669099 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f83fc99-aaad-4d50-822f-b091c1ee0475-config-volume\") pod \"coredns-66bc5c9577-glh59\" (UID: \"5f83fc99-aaad-4d50-822f-b091c1ee0475\") " pod="kube-system/coredns-66bc5c9577-glh59" Jan 23 17:27:14.669316 kubelet[2900]: I0123 17:27:14.669131 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8d63139-4f9a-445f-a5a9-c69b14220343-tigera-ca-bundle\") pod \"calico-kube-controllers-5bcfdccd6d-ph9sw\" (UID: \"d8d63139-4f9a-445f-a5a9-c69b14220343\") " pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" Jan 23 17:27:14.669443 kubelet[2900]: I0123 17:27:14.669154 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/bad712f1-634f-4629-aa4d-a4a0636cb622-calico-apiserver-certs\") pod \"calico-apiserver-dd7cc585d-gp992\" (UID: \"bad712f1-634f-4629-aa4d-a4a0636cb622\") " pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" Jan 23 17:27:14.908155 containerd[1675]: time="2026-01-23T17:27:14.908110671Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bcfdccd6d-ph9sw,Uid:d8d63139-4f9a-445f-a5a9-c69b14220343,Namespace:calico-system,Attempt:0,}" Jan 23 17:27:14.921538 containerd[1675]: time="2026-01-23T17:27:14.921476736Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-glh59,Uid:5f83fc99-aaad-4d50-822f-b091c1ee0475,Namespace:kube-system,Attempt:0,}" Jan 23 17:27:14.935968 containerd[1675]: time="2026-01-23T17:27:14.935672566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-sgfzg,Uid:972bd28e-0f5b-4146-80fb-f01380586d1b,Namespace:kube-system,Attempt:0,}" Jan 
23 17:27:14.938629 containerd[1675]: time="2026-01-23T17:27:14.938553380Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dd7cc585d-gp992,Uid:bad712f1-634f-4629-aa4d-a4a0636cb622,Namespace:calico-apiserver,Attempt:0,}" Jan 23 17:27:14.945582 containerd[1675]: time="2026-01-23T17:27:14.945537295Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-ssq4v,Uid:4e4a5b57-4234-49d1-b775-45759cc5cd06,Namespace:calico-system,Attempt:0,}" Jan 23 17:27:14.961752 containerd[1675]: time="2026-01-23T17:27:14.961644134Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-575b58b9b5-hb7t2,Uid:7377c58d-7984-42ea-99c0-3f9fee9f0b42,Namespace:calico-system,Attempt:0,}" Jan 23 17:27:14.962092 containerd[1675]: time="2026-01-23T17:27:14.961796614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dd7cc585d-hrzqq,Uid:bdc6b38c-9de0-4613-a814-03b925209707,Namespace:calico-apiserver,Attempt:0,}" Jan 23 17:27:15.027650 containerd[1675]: time="2026-01-23T17:27:15.027530977Z" level=error msg="Failed to destroy network for sandbox \"340eb87a337a74fe2079e3309fc79459b284fdfa7428fb705a3cf1d72ef320c1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.030574 systemd[1]: run-netns-cni\x2d9245da11\x2d2e30\x2ddf13\x2da10d\x2d07d1c8ed286a.mount: Deactivated successfully. 
Jan 23 17:27:15.033190 containerd[1675]: time="2026-01-23T17:27:15.033117764Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-glh59,Uid:5f83fc99-aaad-4d50-822f-b091c1ee0475,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"340eb87a337a74fe2079e3309fc79459b284fdfa7428fb705a3cf1d72ef320c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.033559 kubelet[2900]: E0123 17:27:15.033472 2900 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"340eb87a337a74fe2079e3309fc79459b284fdfa7428fb705a3cf1d72ef320c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.033559 kubelet[2900]: E0123 17:27:15.033540 2900 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"340eb87a337a74fe2079e3309fc79459b284fdfa7428fb705a3cf1d72ef320c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-glh59" Jan 23 17:27:15.033559 kubelet[2900]: E0123 17:27:15.033559 2900 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"340eb87a337a74fe2079e3309fc79459b284fdfa7428fb705a3cf1d72ef320c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-66bc5c9577-glh59" Jan 23 17:27:15.033682 kubelet[2900]: E0123 17:27:15.033608 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-glh59_kube-system(5f83fc99-aaad-4d50-822f-b091c1ee0475)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-glh59_kube-system(5f83fc99-aaad-4d50-822f-b091c1ee0475)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"340eb87a337a74fe2079e3309fc79459b284fdfa7428fb705a3cf1d72ef320c1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-glh59" podUID="5f83fc99-aaad-4d50-822f-b091c1ee0475" Jan 23 17:27:15.045164 containerd[1675]: time="2026-01-23T17:27:15.045113383Z" level=error msg="Failed to destroy network for sandbox \"e08df2d2c7a25a5285bb43ba2fab374f7eceeb7d17c008bf50dd20a8d3726823\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.047400 containerd[1675]: time="2026-01-23T17:27:15.047342954Z" level=error msg="Failed to destroy network for sandbox \"95e20279f874b6ca34c7b0bc552719f6e45d586df849c9cc7e74a89217ca13de\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.047565 systemd[1]: run-netns-cni\x2d4d8069e0\x2d0bad\x2dc712\x2d4637\x2d9e8cc4c794be.mount: Deactivated successfully. Jan 23 17:27:15.050943 systemd[1]: run-netns-cni\x2d83795c84\x2d00c4\x2d29e5\x2db428\x2d5824c62a5501.mount: Deactivated successfully. 
Jan 23 17:27:15.052585 containerd[1675]: time="2026-01-23T17:27:15.052527139Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bcfdccd6d-ph9sw,Uid:d8d63139-4f9a-445f-a5a9-c69b14220343,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e08df2d2c7a25a5285bb43ba2fab374f7eceeb7d17c008bf50dd20a8d3726823\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.053010 kubelet[2900]: E0123 17:27:15.052956 2900 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e08df2d2c7a25a5285bb43ba2fab374f7eceeb7d17c008bf50dd20a8d3726823\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.053200 kubelet[2900]: E0123 17:27:15.053122 2900 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e08df2d2c7a25a5285bb43ba2fab374f7eceeb7d17c008bf50dd20a8d3726823\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" Jan 23 17:27:15.053200 kubelet[2900]: E0123 17:27:15.053149 2900 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e08df2d2c7a25a5285bb43ba2fab374f7eceeb7d17c008bf50dd20a8d3726823\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" Jan 23 17:27:15.053825 kubelet[2900]: E0123 17:27:15.053356 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5bcfdccd6d-ph9sw_calico-system(d8d63139-4f9a-445f-a5a9-c69b14220343)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5bcfdccd6d-ph9sw_calico-system(d8d63139-4f9a-445f-a5a9-c69b14220343)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e08df2d2c7a25a5285bb43ba2fab374f7eceeb7d17c008bf50dd20a8d3726823\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343" Jan 23 17:27:15.055047 containerd[1675]: time="2026-01-23T17:27:15.054994711Z" level=error msg="Failed to destroy network for sandbox \"2e8af3db9d81004327ae5cf95854fd5061ddb276382e796a9d7821e4a1940294\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.058542 containerd[1675]: time="2026-01-23T17:27:15.058253967Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dd7cc585d-hrzqq,Uid:bdc6b38c-9de0-4613-a814-03b925209707,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"95e20279f874b6ca34c7b0bc552719f6e45d586df849c9cc7e74a89217ca13de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.059089 kubelet[2900]: E0123 17:27:15.058993 2900 log.go:32] "RunPodSandbox from runtime 
service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95e20279f874b6ca34c7b0bc552719f6e45d586df849c9cc7e74a89217ca13de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.059089 kubelet[2900]: E0123 17:27:15.059048 2900 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95e20279f874b6ca34c7b0bc552719f6e45d586df849c9cc7e74a89217ca13de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" Jan 23 17:27:15.059089 kubelet[2900]: E0123 17:27:15.059066 2900 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95e20279f874b6ca34c7b0bc552719f6e45d586df849c9cc7e74a89217ca13de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" Jan 23 17:27:15.059448 kubelet[2900]: E0123 17:27:15.059154 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-dd7cc585d-hrzqq_calico-apiserver(bdc6b38c-9de0-4613-a814-03b925209707)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-dd7cc585d-hrzqq_calico-apiserver(bdc6b38c-9de0-4613-a814-03b925209707)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"95e20279f874b6ca34c7b0bc552719f6e45d586df849c9cc7e74a89217ca13de\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707" Jan 23 17:27:15.059933 containerd[1675]: time="2026-01-23T17:27:15.059866535Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-sgfzg,Uid:972bd28e-0f5b-4146-80fb-f01380586d1b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e8af3db9d81004327ae5cf95854fd5061ddb276382e796a9d7821e4a1940294\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.060484 kubelet[2900]: E0123 17:27:15.060211 2900 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e8af3db9d81004327ae5cf95854fd5061ddb276382e796a9d7821e4a1940294\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.061540 kubelet[2900]: E0123 17:27:15.060457 2900 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e8af3db9d81004327ae5cf95854fd5061ddb276382e796a9d7821e4a1940294\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-sgfzg" Jan 23 17:27:15.061540 kubelet[2900]: E0123 17:27:15.061478 2900 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e8af3db9d81004327ae5cf95854fd5061ddb276382e796a9d7821e4a1940294\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-sgfzg" Jan 23 17:27:15.061810 kubelet[2900]: E0123 17:27:15.061725 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-sgfzg_kube-system(972bd28e-0f5b-4146-80fb-f01380586d1b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-sgfzg_kube-system(972bd28e-0f5b-4146-80fb-f01380586d1b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2e8af3db9d81004327ae5cf95854fd5061ddb276382e796a9d7821e4a1940294\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-sgfzg" podUID="972bd28e-0f5b-4146-80fb-f01380586d1b" Jan 23 17:27:15.076472 containerd[1675]: time="2026-01-23T17:27:15.076420697Z" level=error msg="Failed to destroy network for sandbox \"6af64e4d60dc6abe3d52ab6c24394e38c0bafa6b125f63bce62e279dd40bd197\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.079976 containerd[1675]: time="2026-01-23T17:27:15.079925474Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dd7cc585d-gp992,Uid:bad712f1-634f-4629-aa4d-a4a0636cb622,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6af64e4d60dc6abe3d52ab6c24394e38c0bafa6b125f63bce62e279dd40bd197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.081280 
kubelet[2900]: E0123 17:27:15.080976 2900 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6af64e4d60dc6abe3d52ab6c24394e38c0bafa6b125f63bce62e279dd40bd197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.081597 kubelet[2900]: E0123 17:27:15.081326 2900 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6af64e4d60dc6abe3d52ab6c24394e38c0bafa6b125f63bce62e279dd40bd197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" Jan 23 17:27:15.081597 kubelet[2900]: E0123 17:27:15.081351 2900 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6af64e4d60dc6abe3d52ab6c24394e38c0bafa6b125f63bce62e279dd40bd197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" Jan 23 17:27:15.081597 kubelet[2900]: E0123 17:27:15.081405 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-dd7cc585d-gp992_calico-apiserver(bad712f1-634f-4629-aa4d-a4a0636cb622)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-dd7cc585d-gp992_calico-apiserver(bad712f1-634f-4629-aa4d-a4a0636cb622)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6af64e4d60dc6abe3d52ab6c24394e38c0bafa6b125f63bce62e279dd40bd197\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622" Jan 23 17:27:15.084261 containerd[1675]: time="2026-01-23T17:27:15.084216735Z" level=error msg="Failed to destroy network for sandbox \"6cdbcb9e01bae7ae728b245671704a3d22f5497817644e593616e4630242ba13\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.086875 containerd[1675]: time="2026-01-23T17:27:15.086809548Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-ssq4v,Uid:4e4a5b57-4234-49d1-b775-45759cc5cd06,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6cdbcb9e01bae7ae728b245671704a3d22f5497817644e593616e4630242ba13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.087201 kubelet[2900]: E0123 17:27:15.087160 2900 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6cdbcb9e01bae7ae728b245671704a3d22f5497817644e593616e4630242ba13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.087333 kubelet[2900]: E0123 17:27:15.087292 2900 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6cdbcb9e01bae7ae728b245671704a3d22f5497817644e593616e4630242ba13\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-ssq4v" Jan 23 17:27:15.087483 kubelet[2900]: E0123 17:27:15.087420 2900 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6cdbcb9e01bae7ae728b245671704a3d22f5497817644e593616e4630242ba13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-ssq4v" Jan 23 17:27:15.087575 kubelet[2900]: E0123 17:27:15.087545 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-ssq4v_calico-system(4e4a5b57-4234-49d1-b775-45759cc5cd06)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-ssq4v_calico-system(4e4a5b57-4234-49d1-b775-45759cc5cd06)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6cdbcb9e01bae7ae728b245671704a3d22f5497817644e593616e4630242ba13\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:27:15.094606 containerd[1675]: time="2026-01-23T17:27:15.094551946Z" level=error msg="Failed to destroy network for sandbox \"72e4eb9b50cce4bc9970a48a335cc54edda77a049416acba5197b0793cef3ad7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.096872 containerd[1675]: time="2026-01-23T17:27:15.096814117Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-575b58b9b5-hb7t2,Uid:7377c58d-7984-42ea-99c0-3f9fee9f0b42,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"72e4eb9b50cce4bc9970a48a335cc54edda77a049416acba5197b0793cef3ad7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.097077 kubelet[2900]: E0123 17:27:15.097043 2900 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72e4eb9b50cce4bc9970a48a335cc54edda77a049416acba5197b0793cef3ad7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.097167 kubelet[2900]: E0123 17:27:15.097096 2900 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72e4eb9b50cce4bc9970a48a335cc54edda77a049416acba5197b0793cef3ad7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-575b58b9b5-hb7t2" Jan 23 17:27:15.097167 kubelet[2900]: E0123 17:27:15.097115 2900 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72e4eb9b50cce4bc9970a48a335cc54edda77a049416acba5197b0793cef3ad7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-575b58b9b5-hb7t2" Jan 23 17:27:15.097254 kubelet[2900]: E0123 17:27:15.097167 2900 pod_workers.go:1324] "Error syncing pod, 
skipping" err="failed to \"CreatePodSandbox\" for \"whisker-575b58b9b5-hb7t2_calico-system(7377c58d-7984-42ea-99c0-3f9fee9f0b42)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-575b58b9b5-hb7t2_calico-system(7377c58d-7984-42ea-99c0-3f9fee9f0b42)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"72e4eb9b50cce4bc9970a48a335cc54edda77a049416acba5197b0793cef3ad7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-575b58b9b5-hb7t2" podUID="7377c58d-7984-42ea-99c0-3f9fee9f0b42" Jan 23 17:27:15.565361 systemd[1]: Created slice kubepods-besteffort-podbaa2fc5a_f231_4d23_992e_37e27c865a7c.slice - libcontainer container kubepods-besteffort-podbaa2fc5a_f231_4d23_992e_37e27c865a7c.slice. Jan 23 17:27:15.569879 containerd[1675]: time="2026-01-23T17:27:15.569841277Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p68bw,Uid:baa2fc5a-f231-4d23-992e-37e27c865a7c,Namespace:calico-system,Attempt:0,}" Jan 23 17:27:15.612177 containerd[1675]: time="2026-01-23T17:27:15.612113884Z" level=error msg="Failed to destroy network for sandbox \"62d66f981ec132093a28a0c3404d7c18a373676a3569392ba8597db7d41b8bee\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.614052 containerd[1675]: time="2026-01-23T17:27:15.614004974Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p68bw,Uid:baa2fc5a-f231-4d23-992e-37e27c865a7c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"62d66f981ec132093a28a0c3404d7c18a373676a3569392ba8597db7d41b8bee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.614332 kubelet[2900]: E0123 17:27:15.614266 2900 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62d66f981ec132093a28a0c3404d7c18a373676a3569392ba8597db7d41b8bee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:15.614381 kubelet[2900]: E0123 17:27:15.614332 2900 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62d66f981ec132093a28a0c3404d7c18a373676a3569392ba8597db7d41b8bee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-p68bw" Jan 23 17:27:15.614381 kubelet[2900]: E0123 17:27:15.614351 2900 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62d66f981ec132093a28a0c3404d7c18a373676a3569392ba8597db7d41b8bee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-p68bw" Jan 23 17:27:15.614444 kubelet[2900]: E0123 17:27:15.614423 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-p68bw_calico-system(baa2fc5a-f231-4d23-992e-37e27c865a7c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-p68bw_calico-system(baa2fc5a-f231-4d23-992e-37e27c865a7c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"62d66f981ec132093a28a0c3404d7c18a373676a3569392ba8597db7d41b8bee\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:27:15.973193 systemd[1]: run-netns-cni\x2dd6c0879f\x2d7c44\x2d496a\x2d0c3e\x2d7fb5c1c68015.mount: Deactivated successfully. Jan 23 17:27:15.973291 systemd[1]: run-netns-cni\x2d20fc4681\x2ddf26\x2d3cb1\x2dc256\x2def872260fc22.mount: Deactivated successfully. Jan 23 17:27:15.973367 systemd[1]: run-netns-cni\x2d30ef1953\x2d6438\x2d3c7e\x2d8f84\x2d3d832dfd5fdd.mount: Deactivated successfully. Jan 23 17:27:15.973411 systemd[1]: run-netns-cni\x2dc3669bb0\x2d8742\x2db080\x2dfbf5\x2df4dff6ffacf6.mount: Deactivated successfully. Jan 23 17:27:18.869044 kubelet[2900]: I0123 17:27:18.868562 2900 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 23 17:27:18.892000 audit[3999]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=3999 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:18.892000 audit[3999]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc907b440 a2=0 a3=1 items=0 ppid=3013 pid=3999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:18.892000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:18.904000 audit[3999]: NETFILTER_CFG table=nat:118 family=2 entries=19 op=nft_register_chain pid=3999 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:18.904000 audit[3999]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffc907b440 a2=0 a3=1 items=0 ppid=3013 
pid=3999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:18.904000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:25.568100 containerd[1675]: time="2026-01-23T17:27:25.567880003Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-ssq4v,Uid:4e4a5b57-4234-49d1-b775-45759cc5cd06,Namespace:calico-system,Attempt:0,}" Jan 23 17:27:25.619145 containerd[1675]: time="2026-01-23T17:27:25.618794013Z" level=error msg="Failed to destroy network for sandbox \"6c35135245c4b8ca5d3fc9660aa32b9c5491a7afe01f6b03d55d6ef8183c1d4f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:25.621025 systemd[1]: run-netns-cni\x2d4bdacae9\x2dd1d1\x2d59b5\x2d35a4\x2d85960b0f05e9.mount: Deactivated successfully. 
Jan 23 17:27:25.623318 containerd[1675]: time="2026-01-23T17:27:25.623236634Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-ssq4v,Uid:4e4a5b57-4234-49d1-b775-45759cc5cd06,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c35135245c4b8ca5d3fc9660aa32b9c5491a7afe01f6b03d55d6ef8183c1d4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:25.623520 kubelet[2900]: E0123 17:27:25.623466 2900 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c35135245c4b8ca5d3fc9660aa32b9c5491a7afe01f6b03d55d6ef8183c1d4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:25.623754 kubelet[2900]: E0123 17:27:25.623524 2900 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c35135245c4b8ca5d3fc9660aa32b9c5491a7afe01f6b03d55d6ef8183c1d4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-ssq4v" Jan 23 17:27:25.623754 kubelet[2900]: E0123 17:27:25.623544 2900 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c35135245c4b8ca5d3fc9660aa32b9c5491a7afe01f6b03d55d6ef8183c1d4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-7c778bb748-ssq4v" Jan 23 17:27:25.623754 kubelet[2900]: E0123 17:27:25.623596 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-ssq4v_calico-system(4e4a5b57-4234-49d1-b775-45759cc5cd06)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-ssq4v_calico-system(4e4a5b57-4234-49d1-b775-45759cc5cd06)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6c35135245c4b8ca5d3fc9660aa32b9c5491a7afe01f6b03d55d6ef8183c1d4f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:27:26.461436 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4191499303.mount: Deactivated successfully. Jan 23 17:27:26.480784 containerd[1675]: time="2026-01-23T17:27:26.480667360Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:27:26.481620 containerd[1675]: time="2026-01-23T17:27:26.481578685Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Jan 23 17:27:26.482849 containerd[1675]: time="2026-01-23T17:27:26.482788331Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:27:26.484913 containerd[1675]: time="2026-01-23T17:27:26.484843261Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 23 17:27:26.485944 containerd[1675]: 
time="2026-01-23T17:27:26.485875906Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 11.823028678s" Jan 23 17:27:26.485944 containerd[1675]: time="2026-01-23T17:27:26.485908186Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Jan 23 17:27:26.502341 containerd[1675]: time="2026-01-23T17:27:26.501459622Z" level=info msg="CreateContainer within sandbox \"062a64adfc7f0a7ed4d5582cbb439ddac258ae5247aef948bb2508dfb4d60af5\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 23 17:27:26.509534 containerd[1675]: time="2026-01-23T17:27:26.509498702Z" level=info msg="Container 7cd18b9adde81e117812dc79c8737516aea3d7c843d32f3ac0da66d07ffecd8a: CDI devices from CRI Config.CDIDevices: []" Jan 23 17:27:26.519798 containerd[1675]: time="2026-01-23T17:27:26.519756712Z" level=info msg="CreateContainer within sandbox \"062a64adfc7f0a7ed4d5582cbb439ddac258ae5247aef948bb2508dfb4d60af5\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"7cd18b9adde81e117812dc79c8737516aea3d7c843d32f3ac0da66d07ffecd8a\"" Jan 23 17:27:26.520590 containerd[1675]: time="2026-01-23T17:27:26.520543276Z" level=info msg="StartContainer for \"7cd18b9adde81e117812dc79c8737516aea3d7c843d32f3ac0da66d07ffecd8a\"" Jan 23 17:27:26.522296 containerd[1675]: time="2026-01-23T17:27:26.522228724Z" level=info msg="connecting to shim 7cd18b9adde81e117812dc79c8737516aea3d7c843d32f3ac0da66d07ffecd8a" address="unix:///run/containerd/s/dfef6d6a2f2ba92efbc359d8eb258a049469366905ea88edc501cc5c196a5241" protocol=ttrpc version=3 Jan 23 17:27:26.543737 systemd[1]: Started 
cri-containerd-7cd18b9adde81e117812dc79c8737516aea3d7c843d32f3ac0da66d07ffecd8a.scope - libcontainer container 7cd18b9adde81e117812dc79c8737516aea3d7c843d32f3ac0da66d07ffecd8a. Jan 23 17:27:26.563786 containerd[1675]: time="2026-01-23T17:27:26.563714048Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dd7cc585d-gp992,Uid:bad712f1-634f-4629-aa4d-a4a0636cb622,Namespace:calico-apiserver,Attempt:0,}" Jan 23 17:27:26.566639 containerd[1675]: time="2026-01-23T17:27:26.566582662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dd7cc585d-hrzqq,Uid:bdc6b38c-9de0-4613-a814-03b925209707,Namespace:calico-apiserver,Attempt:0,}" Jan 23 17:27:26.588000 audit: BPF prog-id=172 op=LOAD Jan 23 17:27:26.589947 kernel: kauditd_printk_skb: 12 callbacks suppressed Jan 23 17:27:26.590066 kernel: audit: type=1334 audit(1769189246.588:571): prog-id=172 op=LOAD Jan 23 17:27:26.590119 kernel: audit: type=1300 audit(1769189246.588:571): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3427 pid=4042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:26.588000 audit[4042]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3427 pid=4042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:26.588000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763643138623961646465383165313137383132646337396338373337 Jan 23 17:27:26.595838 kernel: audit: type=1327 audit(1769189246.588:571): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763643138623961646465383165313137383132646337396338373337 Jan 23 17:27:26.588000 audit: BPF prog-id=173 op=LOAD Jan 23 17:27:26.597018 kernel: audit: type=1334 audit(1769189246.588:572): prog-id=173 op=LOAD Jan 23 17:27:26.597075 kernel: audit: type=1300 audit(1769189246.588:572): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3427 pid=4042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:26.588000 audit[4042]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3427 pid=4042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:26.588000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763643138623961646465383165313137383132646337396338373337 Jan 23 17:27:26.603438 kernel: audit: type=1327 audit(1769189246.588:572): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763643138623961646465383165313137383132646337396338373337 Jan 23 17:27:26.603512 kernel: audit: type=1334 audit(1769189246.589:573): prog-id=173 op=UNLOAD Jan 23 17:27:26.589000 audit: BPF prog-id=173 op=UNLOAD Jan 23 17:27:26.589000 audit[4042]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 
a3=0 items=0 ppid=3427 pid=4042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:26.607208 kernel: audit: type=1300 audit(1769189246.589:573): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3427 pid=4042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:26.589000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763643138623961646465383165313137383132646337396338373337 Jan 23 17:27:26.611227 kernel: audit: type=1327 audit(1769189246.589:573): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763643138623961646465383165313137383132646337396338373337 Jan 23 17:27:26.589000 audit: BPF prog-id=172 op=UNLOAD Jan 23 17:27:26.612513 kernel: audit: type=1334 audit(1769189246.589:574): prog-id=172 op=UNLOAD Jan 23 17:27:26.589000 audit[4042]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3427 pid=4042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:26.589000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763643138623961646465383165313137383132646337396338373337 Jan 23 17:27:26.589000 
audit: BPF prog-id=174 op=LOAD Jan 23 17:27:26.589000 audit[4042]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3427 pid=4042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:26.589000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763643138623961646465383165313137383132646337396338373337 Jan 23 17:27:26.632878 containerd[1675]: time="2026-01-23T17:27:26.632835227Z" level=info msg="StartContainer for \"7cd18b9adde81e117812dc79c8737516aea3d7c843d32f3ac0da66d07ffecd8a\" returns successfully" Jan 23 17:27:26.645047 containerd[1675]: time="2026-01-23T17:27:26.644932966Z" level=error msg="Failed to destroy network for sandbox \"efd76c46fb5b66641491582cf2688ab62c908ad09271b6f9631bc9c15ff13056\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:26.648156 containerd[1675]: time="2026-01-23T17:27:26.648086742Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dd7cc585d-hrzqq,Uid:bdc6b38c-9de0-4613-a814-03b925209707,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"efd76c46fb5b66641491582cf2688ab62c908ad09271b6f9631bc9c15ff13056\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:26.648530 kubelet[2900]: E0123 17:27:26.648368 2900 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown 
desc = failed to setup network for sandbox \"efd76c46fb5b66641491582cf2688ab62c908ad09271b6f9631bc9c15ff13056\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:26.648530 kubelet[2900]: E0123 17:27:26.648440 2900 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efd76c46fb5b66641491582cf2688ab62c908ad09271b6f9631bc9c15ff13056\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" Jan 23 17:27:26.648530 kubelet[2900]: E0123 17:27:26.648483 2900 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efd76c46fb5b66641491582cf2688ab62c908ad09271b6f9631bc9c15ff13056\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" Jan 23 17:27:26.648848 containerd[1675]: time="2026-01-23T17:27:26.648428543Z" level=error msg="Failed to destroy network for sandbox \"5db131c3f7fff54ef2fa6656bf5b0a6e8db42bb42fc91d08b1f99de3e18634e0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:26.648877 kubelet[2900]: E0123 17:27:26.648542 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-dd7cc585d-hrzqq_calico-apiserver(bdc6b38c-9de0-4613-a814-03b925209707)\" with CreatePodSandboxError: \"Failed to create sandbox for 
pod \\\"calico-apiserver-dd7cc585d-hrzqq_calico-apiserver(bdc6b38c-9de0-4613-a814-03b925209707)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"efd76c46fb5b66641491582cf2688ab62c908ad09271b6f9631bc9c15ff13056\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707" Jan 23 17:27:26.651547 containerd[1675]: time="2026-01-23T17:27:26.651448238Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dd7cc585d-gp992,Uid:bad712f1-634f-4629-aa4d-a4a0636cb622,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5db131c3f7fff54ef2fa6656bf5b0a6e8db42bb42fc91d08b1f99de3e18634e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:26.651724 kubelet[2900]: E0123 17:27:26.651648 2900 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5db131c3f7fff54ef2fa6656bf5b0a6e8db42bb42fc91d08b1f99de3e18634e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 23 17:27:26.651829 kubelet[2900]: E0123 17:27:26.651698 2900 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5db131c3f7fff54ef2fa6656bf5b0a6e8db42bb42fc91d08b1f99de3e18634e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" Jan 23 17:27:26.651829 kubelet[2900]: E0123 17:27:26.651748 2900 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5db131c3f7fff54ef2fa6656bf5b0a6e8db42bb42fc91d08b1f99de3e18634e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" Jan 23 17:27:26.651877 kubelet[2900]: E0123 17:27:26.651821 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-dd7cc585d-gp992_calico-apiserver(bad712f1-634f-4629-aa4d-a4a0636cb622)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-dd7cc585d-gp992_calico-apiserver(bad712f1-634f-4629-aa4d-a4a0636cb622)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5db131c3f7fff54ef2fa6656bf5b0a6e8db42bb42fc91d08b1f99de3e18634e0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622" Jan 23 17:27:26.792133 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 23 17:27:26.792246 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jan 23 17:27:26.899333 kubelet[2900]: I0123 17:27:26.899030 2900 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-km4sx" podStartSLOduration=1.128640604 podStartE2EDuration="21.899013533s" podCreationTimestamp="2026-01-23 17:27:05 +0000 UTC" firstStartedPulling="2026-01-23 17:27:05.716366381 +0000 UTC m=+23.251910073" lastFinishedPulling="2026-01-23 17:27:26.48673931 +0000 UTC m=+44.022283002" observedRunningTime="2026-01-23 17:27:26.7088344 +0000 UTC m=+44.244378092" watchObservedRunningTime="2026-01-23 17:27:26.899013533 +0000 UTC m=+44.434557185" Jan 23 17:27:26.950930 kubelet[2900]: I0123 17:27:26.950886 2900 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frq95\" (UniqueName: \"kubernetes.io/projected/7377c58d-7984-42ea-99c0-3f9fee9f0b42-kube-api-access-frq95\") pod \"7377c58d-7984-42ea-99c0-3f9fee9f0b42\" (UID: \"7377c58d-7984-42ea-99c0-3f9fee9f0b42\") " Jan 23 17:27:26.950930 kubelet[2900]: I0123 17:27:26.950938 2900 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7377c58d-7984-42ea-99c0-3f9fee9f0b42-whisker-backend-key-pair\") pod \"7377c58d-7984-42ea-99c0-3f9fee9f0b42\" (UID: \"7377c58d-7984-42ea-99c0-3f9fee9f0b42\") " Jan 23 17:27:26.951096 kubelet[2900]: I0123 17:27:26.950960 2900 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7377c58d-7984-42ea-99c0-3f9fee9f0b42-whisker-ca-bundle\") pod \"7377c58d-7984-42ea-99c0-3f9fee9f0b42\" (UID: \"7377c58d-7984-42ea-99c0-3f9fee9f0b42\") " Jan 23 17:27:26.951424 kubelet[2900]: I0123 17:27:26.951297 2900 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7377c58d-7984-42ea-99c0-3f9fee9f0b42-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "7377c58d-7984-42ea-99c0-3f9fee9f0b42" 
(UID: "7377c58d-7984-42ea-99c0-3f9fee9f0b42"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 23 17:27:26.953618 kubelet[2900]: I0123 17:27:26.953577 2900 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7377c58d-7984-42ea-99c0-3f9fee9f0b42-kube-api-access-frq95" (OuterVolumeSpecName: "kube-api-access-frq95") pod "7377c58d-7984-42ea-99c0-3f9fee9f0b42" (UID: "7377c58d-7984-42ea-99c0-3f9fee9f0b42"). InnerVolumeSpecName "kube-api-access-frq95". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 23 17:27:26.953829 kubelet[2900]: I0123 17:27:26.953785 2900 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7377c58d-7984-42ea-99c0-3f9fee9f0b42-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "7377c58d-7984-42ea-99c0-3f9fee9f0b42" (UID: "7377c58d-7984-42ea-99c0-3f9fee9f0b42"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 23 17:27:27.052426 kubelet[2900]: I0123 17:27:27.051614 2900 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-frq95\" (UniqueName: \"kubernetes.io/projected/7377c58d-7984-42ea-99c0-3f9fee9f0b42-kube-api-access-frq95\") on node \"ci-4547-1-0-a-d0877fd079\" DevicePath \"\"" Jan 23 17:27:27.052426 kubelet[2900]: I0123 17:27:27.051645 2900 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7377c58d-7984-42ea-99c0-3f9fee9f0b42-whisker-backend-key-pair\") on node \"ci-4547-1-0-a-d0877fd079\" DevicePath \"\"" Jan 23 17:27:27.052426 kubelet[2900]: I0123 17:27:27.051654 2900 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7377c58d-7984-42ea-99c0-3f9fee9f0b42-whisker-ca-bundle\") on node \"ci-4547-1-0-a-d0877fd079\" DevicePath \"\"" Jan 23 17:27:27.462373 systemd[1]: var-lib-kubelet-pods-7377c58d\x2d7984\x2d42ea\x2d99c0\x2d3f9fee9f0b42-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dfrq95.mount: Deactivated successfully. Jan 23 17:27:27.462461 systemd[1]: var-lib-kubelet-pods-7377c58d\x2d7984\x2d42ea\x2d99c0\x2d3f9fee9f0b42-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 23 17:27:27.563459 containerd[1675]: time="2026-01-23T17:27:27.563415872Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-sgfzg,Uid:972bd28e-0f5b-4146-80fb-f01380586d1b,Namespace:kube-system,Attempt:0,}" Jan 23 17:27:27.696377 systemd[1]: Removed slice kubepods-besteffort-pod7377c58d_7984_42ea_99c0_3f9fee9f0b42.slice - libcontainer container kubepods-besteffort-pod7377c58d_7984_42ea_99c0_3f9fee9f0b42.slice. 
Jan 23 17:27:27.702701 systemd-networkd[1585]: calif2ac5e9c25b: Link UP Jan 23 17:27:27.702853 systemd-networkd[1585]: calif2ac5e9c25b: Gained carrier Jan 23 17:27:27.718809 containerd[1675]: 2026-01-23 17:27:27.584 [INFO][4196] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 23 17:27:27.718809 containerd[1675]: 2026-01-23 17:27:27.604 [INFO][4196] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--sgfzg-eth0 coredns-66bc5c9577- kube-system 972bd28e-0f5b-4146-80fb-f01380586d1b 840 0 2026-01-23 17:26:48 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-1-0-a-d0877fd079 coredns-66bc5c9577-sgfzg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif2ac5e9c25b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" Namespace="kube-system" Pod="coredns-66bc5c9577-sgfzg" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--sgfzg-" Jan 23 17:27:27.718809 containerd[1675]: 2026-01-23 17:27:27.604 [INFO][4196] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" Namespace="kube-system" Pod="coredns-66bc5c9577-sgfzg" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--sgfzg-eth0" Jan 23 17:27:27.718809 containerd[1675]: 2026-01-23 17:27:27.650 [INFO][4211] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" HandleID="k8s-pod-network.ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" 
Workload="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--sgfzg-eth0" Jan 23 17:27:27.719551 containerd[1675]: 2026-01-23 17:27:27.650 [INFO][4211] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" HandleID="k8s-pod-network.ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" Workload="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--sgfzg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c280), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-1-0-a-d0877fd079", "pod":"coredns-66bc5c9577-sgfzg", "timestamp":"2026-01-23 17:27:27.650282898 +0000 UTC"}, Hostname:"ci-4547-1-0-a-d0877fd079", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 17:27:27.719551 containerd[1675]: 2026-01-23 17:27:27.650 [INFO][4211] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 17:27:27.719551 containerd[1675]: 2026-01-23 17:27:27.650 [INFO][4211] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 17:27:27.719551 containerd[1675]: 2026-01-23 17:27:27.650 [INFO][4211] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-a-d0877fd079' Jan 23 17:27:27.719551 containerd[1675]: 2026-01-23 17:27:27.660 [INFO][4211] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:27.719551 containerd[1675]: 2026-01-23 17:27:27.665 [INFO][4211] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:27.719551 containerd[1675]: 2026-01-23 17:27:27.670 [INFO][4211] ipam/ipam.go 511: Trying affinity for 192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:27.719551 containerd[1675]: 2026-01-23 17:27:27.671 [INFO][4211] ipam/ipam.go 158: Attempting to load block cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:27.719551 containerd[1675]: 2026-01-23 17:27:27.677 [INFO][4211] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:27.720316 containerd[1675]: 2026-01-23 17:27:27.677 [INFO][4211] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.16.0/26 handle="k8s-pod-network.ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:27.720316 containerd[1675]: 2026-01-23 17:27:27.679 [INFO][4211] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca Jan 23 17:27:27.720316 containerd[1675]: 2026-01-23 17:27:27.683 [INFO][4211] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.16.0/26 handle="k8s-pod-network.ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:27.720316 containerd[1675]: 2026-01-23 17:27:27.689 [INFO][4211] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.16.1/26] block=192.168.16.0/26 handle="k8s-pod-network.ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:27.720316 containerd[1675]: 2026-01-23 17:27:27.689 [INFO][4211] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.16.1/26] handle="k8s-pod-network.ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:27.720316 containerd[1675]: 2026-01-23 17:27:27.689 [INFO][4211] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 17:27:27.720316 containerd[1675]: 2026-01-23 17:27:27.689 [INFO][4211] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.16.1/26] IPv6=[] ContainerID="ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" HandleID="k8s-pod-network.ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" Workload="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--sgfzg-eth0" Jan 23 17:27:27.720739 containerd[1675]: 2026-01-23 17:27:27.695 [INFO][4196] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" Namespace="kube-system" Pod="coredns-66bc5c9577-sgfzg" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--sgfzg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--sgfzg-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"972bd28e-0f5b-4146-80fb-f01380586d1b", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 26, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", ContainerID:"", Pod:"coredns-66bc5c9577-sgfzg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.16.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif2ac5e9c25b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:27.720739 containerd[1675]: 2026-01-23 17:27:27.695 [INFO][4196] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.16.1/32] ContainerID="ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" Namespace="kube-system" Pod="coredns-66bc5c9577-sgfzg" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--sgfzg-eth0" Jan 23 17:27:27.720739 containerd[1675]: 2026-01-23 17:27:27.695 [INFO][4196] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif2ac5e9c25b 
ContainerID="ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" Namespace="kube-system" Pod="coredns-66bc5c9577-sgfzg" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--sgfzg-eth0" Jan 23 17:27:27.720739 containerd[1675]: 2026-01-23 17:27:27.703 [INFO][4196] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" Namespace="kube-system" Pod="coredns-66bc5c9577-sgfzg" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--sgfzg-eth0" Jan 23 17:27:27.720739 containerd[1675]: 2026-01-23 17:27:27.703 [INFO][4196] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" Namespace="kube-system" Pod="coredns-66bc5c9577-sgfzg" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--sgfzg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--sgfzg-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"972bd28e-0f5b-4146-80fb-f01380586d1b", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 26, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", 
ContainerID:"ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca", Pod:"coredns-66bc5c9577-sgfzg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.16.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif2ac5e9c25b", MAC:"c2:ea:d6:c3:0c:8b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:27.721172 containerd[1675]: 2026-01-23 17:27:27.716 [INFO][4196] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" Namespace="kube-system" Pod="coredns-66bc5c9577-sgfzg" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--sgfzg-eth0" Jan 23 17:27:27.755996 containerd[1675]: time="2026-01-23T17:27:27.755919536Z" level=info msg="connecting to shim ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca" address="unix:///run/containerd/s/4e7c3bbbd6c3a651fa24014b7438bc348eb959ef51e202b53eaf24cc85f1746b" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:27:27.775113 systemd[1]: Created slice 
kubepods-besteffort-pod348a8202_26ae_4e13_80ce_0b3caa537034.slice - libcontainer container kubepods-besteffort-pod348a8202_26ae_4e13_80ce_0b3caa537034.slice. Jan 23 17:27:27.791714 systemd[1]: Started cri-containerd-ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca.scope - libcontainer container ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca. Jan 23 17:27:27.804000 audit: BPF prog-id=175 op=LOAD Jan 23 17:27:27.804000 audit: BPF prog-id=176 op=LOAD Jan 23 17:27:27.804000 audit[4272]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=4258 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:27.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365316661316639623663393638663831346361326265343961616534 Jan 23 17:27:27.804000 audit: BPF prog-id=176 op=UNLOAD Jan 23 17:27:27.804000 audit[4272]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4258 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:27.804000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365316661316639623663393638663831346361326265343961616534 Jan 23 17:27:27.805000 audit: BPF prog-id=177 op=LOAD Jan 23 17:27:27.805000 audit[4272]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=4258 pid=4272 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:27.805000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365316661316639623663393638663831346361326265343961616534 Jan 23 17:27:27.805000 audit: BPF prog-id=178 op=LOAD Jan 23 17:27:27.805000 audit[4272]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=4258 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:27.805000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365316661316639623663393638663831346361326265343961616534 Jan 23 17:27:27.805000 audit: BPF prog-id=178 op=UNLOAD Jan 23 17:27:27.805000 audit[4272]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4258 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:27.805000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365316661316639623663393638663831346361326265343961616534 Jan 23 17:27:27.805000 audit: BPF prog-id=177 op=UNLOAD Jan 23 17:27:27.805000 audit[4272]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 
a2=0 a3=0 items=0 ppid=4258 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:27.805000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365316661316639623663393638663831346361326265343961616534 Jan 23 17:27:27.805000 audit: BPF prog-id=179 op=LOAD Jan 23 17:27:27.805000 audit[4272]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=4258 pid=4272 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:27.805000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365316661316639623663393638663831346361326265343961616534 Jan 23 17:27:27.826716 containerd[1675]: time="2026-01-23T17:27:27.826672243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-sgfzg,Uid:972bd28e-0f5b-4146-80fb-f01380586d1b,Namespace:kube-system,Attempt:0,} returns sandbox id \"ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca\"" Jan 23 17:27:27.831933 containerd[1675]: time="2026-01-23T17:27:27.831892509Z" level=info msg="CreateContainer within sandbox \"ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 23 17:27:27.841758 containerd[1675]: time="2026-01-23T17:27:27.841714517Z" level=info msg="Container 2e4b4b1c6875206ebdbfb089456d737dd2ee7ef5a30e3b36e9deb4e62a3f824e: CDI devices from CRI Config.CDIDevices: []" 
Jan 23 17:27:27.848612 containerd[1675]: time="2026-01-23T17:27:27.848574111Z" level=info msg="CreateContainer within sandbox \"ce1fa1f9b6c968f814ca2be49aae44bbd1e2233f54c589f56a55417104987bca\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"2e4b4b1c6875206ebdbfb089456d737dd2ee7ef5a30e3b36e9deb4e62a3f824e\"" Jan 23 17:27:27.849462 containerd[1675]: time="2026-01-23T17:27:27.849435195Z" level=info msg="StartContainer for \"2e4b4b1c6875206ebdbfb089456d737dd2ee7ef5a30e3b36e9deb4e62a3f824e\"" Jan 23 17:27:27.850291 containerd[1675]: time="2026-01-23T17:27:27.850267359Z" level=info msg="connecting to shim 2e4b4b1c6875206ebdbfb089456d737dd2ee7ef5a30e3b36e9deb4e62a3f824e" address="unix:///run/containerd/s/4e7c3bbbd6c3a651fa24014b7438bc348eb959ef51e202b53eaf24cc85f1746b" protocol=ttrpc version=3 Jan 23 17:27:27.857743 kubelet[2900]: I0123 17:27:27.857667 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/348a8202-26ae-4e13-80ce-0b3caa537034-whisker-ca-bundle\") pod \"whisker-68b765b4fc-4fz2b\" (UID: \"348a8202-26ae-4e13-80ce-0b3caa537034\") " pod="calico-system/whisker-68b765b4fc-4fz2b" Jan 23 17:27:27.857743 kubelet[2900]: I0123 17:27:27.857716 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5pxr\" (UniqueName: \"kubernetes.io/projected/348a8202-26ae-4e13-80ce-0b3caa537034-kube-api-access-s5pxr\") pod \"whisker-68b765b4fc-4fz2b\" (UID: \"348a8202-26ae-4e13-80ce-0b3caa537034\") " pod="calico-system/whisker-68b765b4fc-4fz2b" Jan 23 17:27:27.858072 kubelet[2900]: I0123 17:27:27.857839 2900 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/348a8202-26ae-4e13-80ce-0b3caa537034-whisker-backend-key-pair\") pod \"whisker-68b765b4fc-4fz2b\" (UID: 
\"348a8202-26ae-4e13-80ce-0b3caa537034\") " pod="calico-system/whisker-68b765b4fc-4fz2b" Jan 23 17:27:27.877119 systemd[1]: Started cri-containerd-2e4b4b1c6875206ebdbfb089456d737dd2ee7ef5a30e3b36e9deb4e62a3f824e.scope - libcontainer container 2e4b4b1c6875206ebdbfb089456d737dd2ee7ef5a30e3b36e9deb4e62a3f824e. Jan 23 17:27:27.889000 audit: BPF prog-id=180 op=LOAD Jan 23 17:27:27.890000 audit: BPF prog-id=181 op=LOAD Jan 23 17:27:27.890000 audit[4297]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4258 pid=4297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:27.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265346234623163363837353230366562646266623038393435366437 Jan 23 17:27:27.890000 audit: BPF prog-id=181 op=UNLOAD Jan 23 17:27:27.890000 audit[4297]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4258 pid=4297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:27.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265346234623163363837353230366562646266623038393435366437 Jan 23 17:27:27.890000 audit: BPF prog-id=182 op=LOAD Jan 23 17:27:27.890000 audit[4297]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4258 pid=4297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:27.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265346234623163363837353230366562646266623038393435366437 Jan 23 17:27:27.890000 audit: BPF prog-id=183 op=LOAD Jan 23 17:27:27.890000 audit[4297]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4258 pid=4297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:27.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265346234623163363837353230366562646266623038393435366437 Jan 23 17:27:27.890000 audit: BPF prog-id=183 op=UNLOAD Jan 23 17:27:27.890000 audit[4297]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4258 pid=4297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:27.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265346234623163363837353230366562646266623038393435366437 Jan 23 17:27:27.890000 audit: BPF prog-id=182 op=UNLOAD Jan 23 17:27:27.890000 audit[4297]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4258 pid=4297 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:27.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265346234623163363837353230366562646266623038393435366437 Jan 23 17:27:27.890000 audit: BPF prog-id=184 op=LOAD Jan 23 17:27:27.890000 audit[4297]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4258 pid=4297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:27.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265346234623163363837353230366562646266623038393435366437 Jan 23 17:27:27.905883 containerd[1675]: time="2026-01-23T17:27:27.905783831Z" level=info msg="StartContainer for \"2e4b4b1c6875206ebdbfb089456d737dd2ee7ef5a30e3b36e9deb4e62a3f824e\" returns successfully" Jan 23 17:27:28.085465 containerd[1675]: time="2026-01-23T17:27:28.085264712Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-68b765b4fc-4fz2b,Uid:348a8202-26ae-4e13-80ce-0b3caa537034,Namespace:calico-system,Attempt:0,}" Jan 23 17:27:28.231320 systemd-networkd[1585]: cali5755824dd20: Link UP Jan 23 17:27:28.232156 systemd-networkd[1585]: cali5755824dd20: Gained carrier Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.116 [INFO][4359] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.132 [INFO][4359] cni-plugin/plugin.go 340: Calico CNI found 
existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--a--d0877fd079-k8s-whisker--68b765b4fc--4fz2b-eth0 whisker-68b765b4fc- calico-system 348a8202-26ae-4e13-80ce-0b3caa537034 931 0 2026-01-23 17:27:27 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:68b765b4fc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4547-1-0-a-d0877fd079 whisker-68b765b4fc-4fz2b eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali5755824dd20 [] [] }} ContainerID="c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" Namespace="calico-system" Pod="whisker-68b765b4fc-4fz2b" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-whisker--68b765b4fc--4fz2b-" Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.132 [INFO][4359] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" Namespace="calico-system" Pod="whisker-68b765b4fc-4fz2b" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-whisker--68b765b4fc--4fz2b-eth0" Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.172 [INFO][4434] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" HandleID="k8s-pod-network.c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" Workload="ci--4547--1--0--a--d0877fd079-k8s-whisker--68b765b4fc--4fz2b-eth0" Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.173 [INFO][4434] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" HandleID="k8s-pod-network.c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" Workload="ci--4547--1--0--a--d0877fd079-k8s-whisker--68b765b4fc--4fz2b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0x40002dd580), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-1-0-a-d0877fd079", "pod":"whisker-68b765b4fc-4fz2b", "timestamp":"2026-01-23 17:27:28.172226698 +0000 UTC"}, Hostname:"ci-4547-1-0-a-d0877fd079", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.173 [INFO][4434] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.173 [INFO][4434] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.173 [INFO][4434] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-a-d0877fd079' Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.188 [INFO][4434] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.194 [INFO][4434] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.201 [INFO][4434] ipam/ipam.go 511: Trying affinity for 192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.204 [INFO][4434] ipam/ipam.go 158: Attempting to load block cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.208 [INFO][4434] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.208 [INFO][4434] ipam/ipam.go 1219: Attempting to assign 1 addresses from block 
block=192.168.16.0/26 handle="k8s-pod-network.c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.211 [INFO][4434] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27 Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.217 [INFO][4434] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.16.0/26 handle="k8s-pod-network.c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.223 [INFO][4434] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.16.2/26] block=192.168.16.0/26 handle="k8s-pod-network.c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.223 [INFO][4434] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.16.2/26] handle="k8s-pod-network.c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.223 [INFO][4434] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 23 17:27:28.248007 containerd[1675]: 2026-01-23 17:27:28.223 [INFO][4434] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.16.2/26] IPv6=[] ContainerID="c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" HandleID="k8s-pod-network.c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" Workload="ci--4547--1--0--a--d0877fd079-k8s-whisker--68b765b4fc--4fz2b-eth0" Jan 23 17:27:28.248591 containerd[1675]: 2026-01-23 17:27:28.226 [INFO][4359] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" Namespace="calico-system" Pod="whisker-68b765b4fc-4fz2b" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-whisker--68b765b4fc--4fz2b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-whisker--68b765b4fc--4fz2b-eth0", GenerateName:"whisker-68b765b4fc-", Namespace:"calico-system", SelfLink:"", UID:"348a8202-26ae-4e13-80ce-0b3caa537034", ResourceVersion:"931", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 27, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"68b765b4fc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", ContainerID:"", Pod:"whisker-68b765b4fc-4fz2b", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.16.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"cali5755824dd20", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:28.248591 containerd[1675]: 2026-01-23 17:27:28.226 [INFO][4359] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.16.2/32] ContainerID="c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" Namespace="calico-system" Pod="whisker-68b765b4fc-4fz2b" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-whisker--68b765b4fc--4fz2b-eth0" Jan 23 17:27:28.248591 containerd[1675]: 2026-01-23 17:27:28.226 [INFO][4359] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5755824dd20 ContainerID="c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" Namespace="calico-system" Pod="whisker-68b765b4fc-4fz2b" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-whisker--68b765b4fc--4fz2b-eth0" Jan 23 17:27:28.248591 containerd[1675]: 2026-01-23 17:27:28.231 [INFO][4359] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" Namespace="calico-system" Pod="whisker-68b765b4fc-4fz2b" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-whisker--68b765b4fc--4fz2b-eth0" Jan 23 17:27:28.248591 containerd[1675]: 2026-01-23 17:27:28.232 [INFO][4359] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" Namespace="calico-system" Pod="whisker-68b765b4fc-4fz2b" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-whisker--68b765b4fc--4fz2b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-whisker--68b765b4fc--4fz2b-eth0", GenerateName:"whisker-68b765b4fc-", Namespace:"calico-system", SelfLink:"", 
UID:"348a8202-26ae-4e13-80ce-0b3caa537034", ResourceVersion:"931", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 27, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"68b765b4fc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", ContainerID:"c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27", Pod:"whisker-68b765b4fc-4fz2b", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.16.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali5755824dd20", MAC:"ce:ed:50:80:b0:43", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:28.248591 containerd[1675]: 2026-01-23 17:27:28.246 [INFO][4359] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" Namespace="calico-system" Pod="whisker-68b765b4fc-4fz2b" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-whisker--68b765b4fc--4fz2b-eth0" Jan 23 17:27:28.284949 containerd[1675]: time="2026-01-23T17:27:28.284424289Z" level=info msg="connecting to shim c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27" address="unix:///run/containerd/s/89a04e96652908296dd0e2ac7527bbdc5d07f2e080ca92ffd4eb8c584134fed5" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:27:28.319567 systemd[1]: Started 
cri-containerd-c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27.scope - libcontainer container c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27. Jan 23 17:27:28.347000 audit: BPF prog-id=185 op=LOAD Jan 23 17:27:28.350000 audit: BPF prog-id=186 op=LOAD Jan 23 17:27:28.350000 audit[4480]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4469 pid=4480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666638383161623563653263636331363039343866313437306364 Jan 23 17:27:28.350000 audit: BPF prog-id=186 op=UNLOAD Jan 23 17:27:28.350000 audit[4480]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4469 pid=4480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666638383161623563653263636331363039343866313437306364 Jan 23 17:27:28.350000 audit: BPF prog-id=187 op=LOAD Jan 23 17:27:28.350000 audit[4480]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4469 pid=4480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.350000 audit: 
PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666638383161623563653263636331363039343866313437306364 Jan 23 17:27:28.350000 audit: BPF prog-id=188 op=LOAD Jan 23 17:27:28.350000 audit[4480]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4469 pid=4480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666638383161623563653263636331363039343866313437306364 Jan 23 17:27:28.350000 audit: BPF prog-id=188 op=UNLOAD Jan 23 17:27:28.350000 audit[4480]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4469 pid=4480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666638383161623563653263636331363039343866313437306364 Jan 23 17:27:28.350000 audit: BPF prog-id=187 op=UNLOAD Jan 23 17:27:28.350000 audit[4480]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4469 pid=4480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 17:27:28.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666638383161623563653263636331363039343866313437306364 Jan 23 17:27:28.351000 audit: BPF prog-id=189 op=LOAD Jan 23 17:27:28.351000 audit[4480]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4469 pid=4480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.351000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339666638383161623563653263636331363039343866313437306364 Jan 23 17:27:28.415828 containerd[1675]: time="2026-01-23T17:27:28.415783693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-68b765b4fc-4fz2b,Uid:348a8202-26ae-4e13-80ce-0b3caa537034,Namespace:calico-system,Attempt:0,} returns sandbox id \"c9ff881ab5ce2ccc160948f1470cd874a6cad7daba1c25dd05daf6bfb163db27\"" Jan 23 17:27:28.418119 containerd[1675]: time="2026-01-23T17:27:28.418082704Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 17:27:28.435000 audit: BPF prog-id=190 op=LOAD Jan 23 17:27:28.435000 audit[4537]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd421cf58 a2=98 a3=ffffd421cf48 items=0 ppid=4345 pid=4537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.435000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 17:27:28.435000 audit: BPF prog-id=190 op=UNLOAD Jan 23 17:27:28.435000 audit[4537]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd421cf28 a3=0 items=0 ppid=4345 pid=4537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.435000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 17:27:28.436000 audit: BPF prog-id=191 op=LOAD Jan 23 17:27:28.436000 audit[4537]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd421ce08 a2=74 a3=95 items=0 ppid=4345 pid=4537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.436000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 17:27:28.436000 audit: BPF prog-id=191 op=UNLOAD Jan 23 17:27:28.436000 audit[4537]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4345 pid=4537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.436000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 17:27:28.436000 audit: BPF prog-id=192 op=LOAD Jan 23 17:27:28.436000 audit[4537]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd421ce38 a2=40 a3=ffffd421ce68 items=0 ppid=4345 pid=4537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.436000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 17:27:28.436000 audit: BPF prog-id=192 op=UNLOAD Jan 23 17:27:28.436000 audit[4537]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffd421ce68 items=0 ppid=4345 pid=4537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.436000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 23 17:27:28.437000 audit: BPF prog-id=193 op=LOAD Jan 23 17:27:28.437000 audit[4538]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcaec68e8 a2=98 a3=ffffcaec68d8 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.437000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.437000 audit: BPF prog-id=193 op=UNLOAD Jan 23 17:27:28.437000 audit[4538]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffcaec68b8 a3=0 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.437000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.437000 audit: BPF prog-id=194 op=LOAD Jan 23 17:27:28.437000 audit[4538]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffcaec6578 a2=74 a3=95 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.437000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.437000 audit: BPF prog-id=194 op=UNLOAD Jan 23 17:27:28.437000 audit[4538]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.437000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.437000 audit: BPF prog-id=195 op=LOAD Jan 23 17:27:28.437000 audit[4538]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffcaec65d8 a2=94 a3=2 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.437000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.437000 audit: BPF prog-id=195 op=UNLOAD Jan 23 17:27:28.437000 audit[4538]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.437000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.538000 audit: BPF prog-id=196 op=LOAD Jan 23 17:27:28.538000 audit[4538]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffcaec6598 a2=40 a3=ffffcaec65c8 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.538000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.538000 audit: BPF prog-id=196 op=UNLOAD Jan 23 17:27:28.538000 audit[4538]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffcaec65c8 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.538000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.548000 audit: BPF prog-id=197 op=LOAD Jan 23 17:27:28.548000 audit[4538]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffcaec65a8 a2=94 a3=4 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 17:27:28.548000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.548000 audit: BPF prog-id=197 op=UNLOAD Jan 23 17:27:28.548000 audit[4538]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.548000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.548000 audit: BPF prog-id=198 op=LOAD Jan 23 17:27:28.548000 audit[4538]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffcaec63e8 a2=94 a3=5 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.548000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.548000 audit: BPF prog-id=198 op=UNLOAD Jan 23 17:27:28.548000 audit[4538]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.548000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.548000 audit: BPF prog-id=199 op=LOAD Jan 23 17:27:28.548000 audit[4538]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffcaec6618 a2=94 a3=6 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.548000 audit: PROCTITLE 
proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.548000 audit: BPF prog-id=199 op=UNLOAD Jan 23 17:27:28.548000 audit[4538]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.548000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.548000 audit: BPF prog-id=200 op=LOAD Jan 23 17:27:28.548000 audit[4538]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffcaec5de8 a2=94 a3=83 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.548000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.549000 audit: BPF prog-id=201 op=LOAD Jan 23 17:27:28.549000 audit[4538]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffcaec5ba8 a2=94 a3=2 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.549000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.549000 audit: BPF prog-id=201 op=UNLOAD Jan 23 17:27:28.549000 audit[4538]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.549000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.549000 
audit: BPF prog-id=200 op=UNLOAD Jan 23 17:27:28.549000 audit[4538]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=246a620 a3=245db00 items=0 ppid=4345 pid=4538 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.549000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 23 17:27:28.558000 audit: BPF prog-id=202 op=LOAD Jan 23 17:27:28.558000 audit[4541]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdd342358 a2=98 a3=ffffdd342348 items=0 ppid=4345 pid=4541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.558000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 17:27:28.558000 audit: BPF prog-id=202 op=UNLOAD Jan 23 17:27:28.558000 audit[4541]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffdd342328 a3=0 items=0 ppid=4345 pid=4541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.558000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 17:27:28.558000 audit: BPF prog-id=203 op=LOAD Jan 23 17:27:28.558000 audit[4541]: SYSCALL arch=c00000b7 syscall=280 success=yes 
exit=3 a0=5 a1=ffffdd342208 a2=74 a3=95 items=0 ppid=4345 pid=4541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.558000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 17:27:28.558000 audit: BPF prog-id=203 op=UNLOAD Jan 23 17:27:28.558000 audit[4541]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4345 pid=4541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.558000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 17:27:28.558000 audit: BPF prog-id=204 op=LOAD Jan 23 17:27:28.558000 audit[4541]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdd342238 a2=40 a3=ffffdd342268 items=0 ppid=4345 pid=4541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.558000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 17:27:28.558000 audit: BPF prog-id=204 
op=UNLOAD Jan 23 17:27:28.558000 audit[4541]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffdd342268 items=0 ppid=4345 pid=4541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.558000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 23 17:27:28.563537 kubelet[2900]: I0123 17:27:28.563504 2900 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7377c58d-7984-42ea-99c0-3f9fee9f0b42" path="/var/lib/kubelet/pods/7377c58d-7984-42ea-99c0-3f9fee9f0b42/volumes" Jan 23 17:27:28.563939 containerd[1675]: time="2026-01-23T17:27:28.563890700Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bcfdccd6d-ph9sw,Uid:d8d63139-4f9a-445f-a5a9-c69b14220343,Namespace:calico-system,Attempt:0,}" Jan 23 17:27:28.565970 containerd[1675]: time="2026-01-23T17:27:28.565927110Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-glh59,Uid:5f83fc99-aaad-4d50-822f-b091c1ee0475,Namespace:kube-system,Attempt:0,}" Jan 23 17:27:28.655513 systemd-networkd[1585]: vxlan.calico: Link UP Jan 23 17:27:28.655522 systemd-networkd[1585]: vxlan.calico: Gained carrier Jan 23 17:27:28.688000 audit: BPF prog-id=205 op=LOAD Jan 23 17:27:28.688000 audit[4612]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffda3541f8 a2=98 a3=ffffda3541e8 items=0 ppid=4345 pid=4612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.688000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 17:27:28.688000 audit: BPF prog-id=205 op=UNLOAD Jan 23 17:27:28.688000 audit[4612]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffda3541c8 a3=0 items=0 ppid=4345 pid=4612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.688000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 17:27:28.688000 audit: BPF prog-id=206 op=LOAD Jan 23 17:27:28.688000 audit[4612]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffda353ed8 a2=74 a3=95 items=0 ppid=4345 pid=4612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.688000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 17:27:28.688000 audit: BPF prog-id=206 op=UNLOAD Jan 23 17:27:28.688000 audit[4612]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4345 pid=4612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.688000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 17:27:28.688000 audit: BPF prog-id=207 op=LOAD Jan 23 17:27:28.688000 audit[4612]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffda353f38 a2=94 a3=2 items=0 ppid=4345 pid=4612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.688000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 17:27:28.688000 audit: BPF prog-id=207 op=UNLOAD Jan 23 17:27:28.688000 audit[4612]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4345 pid=4612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.688000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 17:27:28.688000 audit: BPF prog-id=208 op=LOAD Jan 23 17:27:28.688000 audit[4612]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffda353db8 a2=40 a3=ffffda353de8 items=0 ppid=4345 pid=4612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.688000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 17:27:28.688000 audit: BPF prog-id=208 op=UNLOAD Jan 23 17:27:28.688000 audit[4612]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=ffffda353de8 items=0 ppid=4345 pid=4612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.688000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 17:27:28.688000 audit: BPF prog-id=209 op=LOAD Jan 23 17:27:28.688000 audit[4612]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffda353f08 a2=94 a3=b7 items=0 ppid=4345 pid=4612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.688000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 17:27:28.688000 audit: BPF prog-id=209 op=UNLOAD Jan 23 17:27:28.688000 audit[4612]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4345 pid=4612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.688000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 17:27:28.691000 audit: BPF prog-id=210 op=LOAD Jan 23 17:27:28.691000 audit[4612]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffda3535b8 a2=94 a3=2 items=0 ppid=4345 pid=4612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.691000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 17:27:28.691000 audit: BPF prog-id=210 op=UNLOAD Jan 23 17:27:28.691000 audit[4612]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4345 pid=4612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.691000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 17:27:28.691000 audit: BPF prog-id=211 op=LOAD Jan 23 17:27:28.691000 audit[4612]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffda353748 a2=94 a3=30 items=0 ppid=4345 pid=4612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.691000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 23 17:27:28.700000 audit: BPF prog-id=212 op=LOAD Jan 23 17:27:28.700000 audit[4618]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdd626f58 a2=98 a3=ffffdd626f48 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.700000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.701000 audit: BPF prog-id=212 op=UNLOAD Jan 23 17:27:28.701000 audit[4618]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffdd626f28 a3=0 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.701000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.702000 audit: BPF prog-id=213 op=LOAD Jan 23 17:27:28.702000 audit[4618]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffdd626be8 a2=74 a3=95 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.702000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.702000 audit: BPF prog-id=213 op=UNLOAD Jan 23 17:27:28.702000 audit[4618]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.702000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.702000 audit: BPF prog-id=214 op=LOAD Jan 23 17:27:28.702000 audit[4618]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffdd626c48 a2=94 a3=2 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.702000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.704000 audit: BPF prog-id=214 op=UNLOAD Jan 23 17:27:28.704000 audit[4618]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.704000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.720609 kubelet[2900]: I0123 17:27:28.720546 2900 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-sgfzg" podStartSLOduration=40.720530188 podStartE2EDuration="40.720530188s" podCreationTimestamp="2026-01-23 17:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 17:27:28.720125666 +0000 UTC m=+46.255669358" watchObservedRunningTime="2026-01-23 17:27:28.720530188 +0000 UTC m=+46.256073840" Jan 23 17:27:28.734000 audit[4626]: NETFILTER_CFG table=filter:119 family=2 entries=20 op=nft_register_rule pid=4626 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:28.734000 audit[4626]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffec7b2100 a2=0 a3=1 items=0 ppid=3013 pid=4626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.734000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:28.741000 audit[4626]: NETFILTER_CFG table=nat:120 family=2 entries=14 op=nft_register_rule pid=4626 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:28.741000 audit[4626]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffec7b2100 a2=0 a3=1 items=0 ppid=3013 pid=4626 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.741000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:28.755024 containerd[1675]: time="2026-01-23T17:27:28.754880397Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:28.755650 systemd-networkd[1585]: cali405ca985158: Link UP Jan 23 17:27:28.756847 containerd[1675]: time="2026-01-23T17:27:28.756674965Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 17:27:28.756847 containerd[1675]: time="2026-01-23T17:27:28.756769326Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:28.756721 systemd-networkd[1585]: cali405ca985158: Gained carrier Jan 23 17:27:28.757770 kubelet[2900]: E0123 17:27:28.757732 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 17:27:28.757922 kubelet[2900]: E0123 17:27:28.757865 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 17:27:28.758041 kubelet[2900]: E0123 17:27:28.758013 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-68b765b4fc-4fz2b_calico-system(348a8202-26ae-4e13-80ce-0b3caa537034): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed 
to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:28.759138 containerd[1675]: time="2026-01-23T17:27:28.759100097Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.628 [INFO][4553] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--a--d0877fd079-k8s-calico--kube--controllers--5bcfdccd6d--ph9sw-eth0 calico-kube-controllers-5bcfdccd6d- calico-system d8d63139-4f9a-445f-a5a9-c69b14220343 834 0 2026-01-23 17:27:05 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5bcfdccd6d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4547-1-0-a-d0877fd079 calico-kube-controllers-5bcfdccd6d-ph9sw eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali405ca985158 [] [] }} ContainerID="547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" Namespace="calico-system" Pod="calico-kube-controllers-5bcfdccd6d-ph9sw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--kube--controllers--5bcfdccd6d--ph9sw-" Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.629 [INFO][4553] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" Namespace="calico-system" Pod="calico-kube-controllers-5bcfdccd6d-ph9sw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--kube--controllers--5bcfdccd6d--ph9sw-eth0" Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.673 [INFO][4587] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" 
HandleID="k8s-pod-network.547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" Workload="ci--4547--1--0--a--d0877fd079-k8s-calico--kube--controllers--5bcfdccd6d--ph9sw-eth0" Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.673 [INFO][4587] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" HandleID="k8s-pod-network.547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" Workload="ci--4547--1--0--a--d0877fd079-k8s-calico--kube--controllers--5bcfdccd6d--ph9sw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000510a80), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-1-0-a-d0877fd079", "pod":"calico-kube-controllers-5bcfdccd6d-ph9sw", "timestamp":"2026-01-23 17:27:28.673416517 +0000 UTC"}, Hostname:"ci-4547-1-0-a-d0877fd079", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.673 [INFO][4587] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.673 [INFO][4587] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.673 [INFO][4587] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-a-d0877fd079' Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.686 [INFO][4587] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.697 [INFO][4587] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.719 [INFO][4587] ipam/ipam.go 511: Trying affinity for 192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.725 [INFO][4587] ipam/ipam.go 158: Attempting to load block cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.727 [INFO][4587] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.728 [INFO][4587] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.16.0/26 handle="k8s-pod-network.547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.730 [INFO][4587] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.736 [INFO][4587] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.16.0/26 handle="k8s-pod-network.547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.745 [INFO][4587] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.16.3/26] block=192.168.16.0/26 handle="k8s-pod-network.547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.746 [INFO][4587] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.16.3/26] handle="k8s-pod-network.547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.746 [INFO][4587] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 17:27:28.774423 containerd[1675]: 2026-01-23 17:27:28.746 [INFO][4587] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.16.3/26] IPv6=[] ContainerID="547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" HandleID="k8s-pod-network.547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" Workload="ci--4547--1--0--a--d0877fd079-k8s-calico--kube--controllers--5bcfdccd6d--ph9sw-eth0" Jan 23 17:27:28.774914 containerd[1675]: 2026-01-23 17:27:28.749 [INFO][4553] cni-plugin/k8s.go 418: Populated endpoint ContainerID="547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" Namespace="calico-system" Pod="calico-kube-controllers-5bcfdccd6d-ph9sw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--kube--controllers--5bcfdccd6d--ph9sw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-calico--kube--controllers--5bcfdccd6d--ph9sw-eth0", GenerateName:"calico-kube-controllers-5bcfdccd6d-", Namespace:"calico-system", SelfLink:"", UID:"d8d63139-4f9a-445f-a5a9-c69b14220343", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 27, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5bcfdccd6d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", ContainerID:"", Pod:"calico-kube-controllers-5bcfdccd6d-ph9sw", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.16.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali405ca985158", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:28.774914 containerd[1675]: 2026-01-23 17:27:28.749 [INFO][4553] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.16.3/32] ContainerID="547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" Namespace="calico-system" Pod="calico-kube-controllers-5bcfdccd6d-ph9sw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--kube--controllers--5bcfdccd6d--ph9sw-eth0" Jan 23 17:27:28.774914 containerd[1675]: 2026-01-23 17:27:28.750 [INFO][4553] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali405ca985158 ContainerID="547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" Namespace="calico-system" Pod="calico-kube-controllers-5bcfdccd6d-ph9sw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--kube--controllers--5bcfdccd6d--ph9sw-eth0" Jan 23 17:27:28.774914 containerd[1675]: 2026-01-23 17:27:28.757 [INFO][4553] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" Namespace="calico-system" Pod="calico-kube-controllers-5bcfdccd6d-ph9sw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--kube--controllers--5bcfdccd6d--ph9sw-eth0" Jan 23 17:27:28.774914 containerd[1675]: 2026-01-23 17:27:28.758 [INFO][4553] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" Namespace="calico-system" Pod="calico-kube-controllers-5bcfdccd6d-ph9sw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--kube--controllers--5bcfdccd6d--ph9sw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-calico--kube--controllers--5bcfdccd6d--ph9sw-eth0", GenerateName:"calico-kube-controllers-5bcfdccd6d-", Namespace:"calico-system", SelfLink:"", UID:"d8d63139-4f9a-445f-a5a9-c69b14220343", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 27, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5bcfdccd6d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", ContainerID:"547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d", Pod:"calico-kube-controllers-5bcfdccd6d-ph9sw", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.16.3/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali405ca985158", MAC:"ee:1a:46:90:30:1b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:28.774914 containerd[1675]: 2026-01-23 17:27:28.771 [INFO][4553] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" Namespace="calico-system" Pod="calico-kube-controllers-5bcfdccd6d-ph9sw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--kube--controllers--5bcfdccd6d--ph9sw-eth0" Jan 23 17:27:28.803889 containerd[1675]: time="2026-01-23T17:27:28.803836717Z" level=info msg="connecting to shim 547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d" address="unix:///run/containerd/s/87bc4362bf94f30c684d2a461a84f35245d0bf1ce23454f90adad8968c146b2e" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:27:28.836550 systemd[1]: Started cri-containerd-547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d.scope - libcontainer container 547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d. 
Jan 23 17:27:28.839718 systemd-networkd[1585]: cali033d472ee10: Link UP Jan 23 17:27:28.840650 systemd-networkd[1585]: cali033d472ee10: Gained carrier Jan 23 17:27:28.849000 audit: BPF prog-id=215 op=LOAD Jan 23 17:27:28.852000 audit: BPF prog-id=216 op=LOAD Jan 23 17:27:28.852000 audit[4654]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=4641 pid=4654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534376563633365643565646266643763396463336461303662666138 Jan 23 17:27:28.852000 audit: BPF prog-id=216 op=UNLOAD Jan 23 17:27:28.852000 audit[4654]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4641 pid=4654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534376563633365643565646266643763396463336461303662666138 Jan 23 17:27:28.853000 audit: BPF prog-id=217 op=LOAD Jan 23 17:27:28.853000 audit[4654]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=4641 pid=4654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.853000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534376563633365643565646266643763396463336461303662666138 Jan 23 17:27:28.853000 audit: BPF prog-id=218 op=LOAD Jan 23 17:27:28.853000 audit[4654]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=4641 pid=4654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.853000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534376563633365643565646266643763396463336461303662666138 Jan 23 17:27:28.854000 audit: BPF prog-id=218 op=UNLOAD Jan 23 17:27:28.854000 audit[4654]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4641 pid=4654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.854000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534376563633365643565646266643763396463336461303662666138 Jan 23 17:27:28.854000 audit: BPF prog-id=217 op=UNLOAD Jan 23 17:27:28.854000 audit[4654]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4641 pid=4654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 17:27:28.854000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534376563633365643565646266643763396463336461303662666138 Jan 23 17:27:28.854000 audit: BPF prog-id=219 op=LOAD Jan 23 17:27:28.854000 audit[4654]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=4641 pid=4654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.854000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534376563633365643565646266643763396463336461303662666138 Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.640 [INFO][4554] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--glh59-eth0 coredns-66bc5c9577- kube-system 5f83fc99-aaad-4d50-822f-b091c1ee0475 837 0 2026-01-23 17:26:48 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-1-0-a-d0877fd079 coredns-66bc5c9577-glh59 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali033d472ee10 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" Namespace="kube-system" Pod="coredns-66bc5c9577-glh59" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--glh59-" Jan 
23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.641 [INFO][4554] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" Namespace="kube-system" Pod="coredns-66bc5c9577-glh59" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--glh59-eth0" Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.710 [INFO][4593] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" HandleID="k8s-pod-network.2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" Workload="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--glh59-eth0" Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.710 [INFO][4593] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" HandleID="k8s-pod-network.2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" Workload="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--glh59-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000397de0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-1-0-a-d0877fd079", "pod":"coredns-66bc5c9577-glh59", "timestamp":"2026-01-23 17:27:28.710074297 +0000 UTC"}, Hostname:"ci-4547-1-0-a-d0877fd079", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.710 [INFO][4593] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.746 [INFO][4593] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.746 [INFO][4593] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-a-d0877fd079' Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.787 [INFO][4593] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.797 [INFO][4593] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.811 [INFO][4593] ipam/ipam.go 511: Trying affinity for 192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.816 [INFO][4593] ipam/ipam.go 158: Attempting to load block cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.818 [INFO][4593] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.818 [INFO][4593] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.16.0/26 handle="k8s-pod-network.2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.820 [INFO][4593] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973 Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.826 [INFO][4593] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.16.0/26 handle="k8s-pod-network.2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.833 [INFO][4593] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.16.4/26] block=192.168.16.0/26 handle="k8s-pod-network.2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.833 [INFO][4593] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.16.4/26] handle="k8s-pod-network.2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.834 [INFO][4593] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 17:27:28.861124 containerd[1675]: 2026-01-23 17:27:28.834 [INFO][4593] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.16.4/26] IPv6=[] ContainerID="2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" HandleID="k8s-pod-network.2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" Workload="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--glh59-eth0" Jan 23 17:27:28.861895 containerd[1675]: 2026-01-23 17:27:28.836 [INFO][4554] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" Namespace="kube-system" Pod="coredns-66bc5c9577-glh59" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--glh59-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--glh59-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"5f83fc99-aaad-4d50-822f-b091c1ee0475", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 26, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", ContainerID:"", Pod:"coredns-66bc5c9577-glh59", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.16.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali033d472ee10", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:28.861895 containerd[1675]: 2026-01-23 17:27:28.836 [INFO][4554] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.16.4/32] ContainerID="2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" Namespace="kube-system" Pod="coredns-66bc5c9577-glh59" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--glh59-eth0" Jan 23 17:27:28.861895 containerd[1675]: 2026-01-23 17:27:28.836 [INFO][4554] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali033d472ee10 
ContainerID="2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" Namespace="kube-system" Pod="coredns-66bc5c9577-glh59" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--glh59-eth0" Jan 23 17:27:28.861895 containerd[1675]: 2026-01-23 17:27:28.841 [INFO][4554] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" Namespace="kube-system" Pod="coredns-66bc5c9577-glh59" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--glh59-eth0" Jan 23 17:27:28.861895 containerd[1675]: 2026-01-23 17:27:28.842 [INFO][4554] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" Namespace="kube-system" Pod="coredns-66bc5c9577-glh59" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--glh59-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--glh59-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"5f83fc99-aaad-4d50-822f-b091c1ee0475", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 26, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", 
ContainerID:"2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973", Pod:"coredns-66bc5c9577-glh59", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.16.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali033d472ee10", MAC:"f6:da:a3:f7:94:9f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:28.862155 containerd[1675]: 2026-01-23 17:27:28.858 [INFO][4554] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" Namespace="kube-system" Pod="coredns-66bc5c9577-glh59" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-coredns--66bc5c9577--glh59-eth0" Jan 23 17:27:28.866000 audit: BPF prog-id=220 op=LOAD Jan 23 17:27:28.866000 audit[4618]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffdd626c08 a2=40 a3=ffffdd626c38 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.866000 audit: 
PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.866000 audit: BPF prog-id=220 op=UNLOAD Jan 23 17:27:28.866000 audit[4618]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffdd626c38 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.866000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.876000 audit: BPF prog-id=221 op=LOAD Jan 23 17:27:28.876000 audit[4618]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffdd626c18 a2=94 a3=4 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.876000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.876000 audit: BPF prog-id=221 op=UNLOAD Jan 23 17:27:28.876000 audit[4618]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.876000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.876000 audit: BPF prog-id=222 op=LOAD Jan 23 17:27:28.876000 audit[4618]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffdd626a58 a2=94 a3=5 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.876000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.876000 audit: BPF prog-id=222 op=UNLOAD Jan 23 17:27:28.876000 audit[4618]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.876000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.876000 audit: BPF prog-id=223 op=LOAD Jan 23 17:27:28.876000 audit[4618]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffdd626c88 a2=94 a3=6 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.876000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.876000 audit: BPF prog-id=223 op=UNLOAD Jan 23 17:27:28.876000 audit[4618]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.876000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.876000 audit: BPF prog-id=224 op=LOAD Jan 23 17:27:28.876000 audit[4618]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffdd626458 a2=94 a3=83 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.876000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.877000 audit: BPF prog-id=225 op=LOAD Jan 23 17:27:28.877000 audit[4618]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffdd626218 a2=94 a3=2 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.877000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.877000 audit: BPF prog-id=225 op=UNLOAD Jan 23 17:27:28.877000 audit[4618]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.877000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.877000 audit: BPF prog-id=224 op=UNLOAD Jan 23 17:27:28.877000 audit[4618]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=5ab7620 a3=5aaab00 items=0 ppid=4345 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.877000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 23 17:27:28.880936 containerd[1675]: time="2026-01-23T17:27:28.880833935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5bcfdccd6d-ph9sw,Uid:d8d63139-4f9a-445f-a5a9-c69b14220343,Namespace:calico-system,Attempt:0,} returns sandbox id \"547ecc3ed5edbfd7c9dc3da06bfa8eaf1c348dea2ced2c777f3c6da0cf39756d\"" Jan 23 17:27:28.883000 audit: BPF prog-id=211 op=UNLOAD Jan 23 17:27:28.883000 audit[4345]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=4000f36200 a2=0 a3=0 items=0 ppid=4335 pid=4345 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.883000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 23 17:27:28.898389 containerd[1675]: time="2026-01-23T17:27:28.897917738Z" level=info msg="connecting to shim 2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973" address="unix:///run/containerd/s/be2ef299eb22b2285f2ab7e2fe57c1700819d991716daacc94997e1723b809b7" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:27:28.928695 systemd[1]: Started cri-containerd-2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973.scope - libcontainer container 2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973. Jan 23 17:27:28.941000 audit: BPF prog-id=226 op=LOAD Jan 23 17:27:28.941000 audit: BPF prog-id=227 op=LOAD Jan 23 17:27:28.941000 audit[4712]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4698 pid=4712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.941000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313933386138623639623433363834303939323931613034366462 Jan 23 17:27:28.941000 audit: BPF prog-id=227 op=UNLOAD Jan 23 17:27:28.941000 audit[4712]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4698 pid=4712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.941000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313933386138623639623433363834303939323931613034366462 Jan 23 17:27:28.942000 audit: BPF prog-id=228 op=LOAD Jan 23 17:27:28.942000 audit[4712]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4698 pid=4712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.942000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313933386138623639623433363834303939323931613034366462 Jan 23 17:27:28.942000 audit: BPF prog-id=229 op=LOAD Jan 23 17:27:28.942000 audit[4712]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4698 pid=4712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.942000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313933386138623639623433363834303939323931613034366462 Jan 23 17:27:28.942000 audit: BPF prog-id=229 op=UNLOAD Jan 23 17:27:28.942000 audit[4712]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4698 pid=4712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 23 17:27:28.942000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313933386138623639623433363834303939323931613034366462 Jan 23 17:27:28.943000 audit: BPF prog-id=228 op=UNLOAD Jan 23 17:27:28.943000 audit[4712]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4698 pid=4712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.943000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313933386138623639623433363834303939323931613034366462 Jan 23 17:27:28.943000 audit: BPF prog-id=230 op=LOAD Jan 23 17:27:28.943000 audit[4712]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4698 pid=4712 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.943000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265313933386138623639623433363834303939323931613034366462 Jan 23 17:27:28.944000 audit[4749]: NETFILTER_CFG table=nat:121 family=2 entries=15 op=nft_register_chain pid=4749 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 17:27:28.944000 audit[4749]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=fffff0a9e420 a2=0 a3=ffff833f7fa8 items=0 
ppid=4345 pid=4749 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.944000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 17:27:28.948000 audit[4750]: NETFILTER_CFG table=mangle:122 family=2 entries=16 op=nft_register_chain pid=4750 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 17:27:28.948000 audit[4750]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=ffffe9f2c0c0 a2=0 a3=ffff849b7fa8 items=0 ppid=4345 pid=4750 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.948000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 17:27:28.959000 audit[4747]: NETFILTER_CFG table=raw:123 family=2 entries=21 op=nft_register_chain pid=4747 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 17:27:28.959000 audit[4747]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=ffffe9092d20 a2=0 a3=ffffb2c33fa8 items=0 ppid=4345 pid=4747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.959000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 17:27:28.972437 containerd[1675]: time="2026-01-23T17:27:28.972393544Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-66bc5c9577-glh59,Uid:5f83fc99-aaad-4d50-822f-b091c1ee0475,Namespace:kube-system,Attempt:0,} returns sandbox id \"2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973\"" Jan 23 17:27:28.977920 containerd[1675]: time="2026-01-23T17:27:28.977879251Z" level=info msg="CreateContainer within sandbox \"2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 23 17:27:28.963000 audit[4752]: NETFILTER_CFG table=filter:124 family=2 entries=128 op=nft_register_chain pid=4752 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 17:27:28.963000 audit[4752]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=72768 a0=3 a1=ffffdf18c610 a2=0 a3=ffffa4dc7fa8 items=0 ppid=4345 pid=4752 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:28.963000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 17:27:28.993338 containerd[1675]: time="2026-01-23T17:27:28.993136405Z" level=info msg="Container e3ea408937adb7256cf4438f5e1e7141296493463bcd6bec42802328f6261f13: CDI devices from CRI Config.CDIDevices: []" Jan 23 17:27:29.001184 containerd[1675]: time="2026-01-23T17:27:29.001120685Z" level=info msg="CreateContainer within sandbox \"2e1938a8b69b43684099291a046db1dd6db34fd5f7b344f78892e09e11c7c973\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e3ea408937adb7256cf4438f5e1e7141296493463bcd6bec42802328f6261f13\"" Jan 23 17:27:29.001827 containerd[1675]: time="2026-01-23T17:27:29.001798128Z" level=info msg="StartContainer for \"e3ea408937adb7256cf4438f5e1e7141296493463bcd6bec42802328f6261f13\"" Jan 23 17:27:29.004478 containerd[1675]: 
time="2026-01-23T17:27:29.004316700Z" level=info msg="connecting to shim e3ea408937adb7256cf4438f5e1e7141296493463bcd6bec42802328f6261f13" address="unix:///run/containerd/s/be2ef299eb22b2285f2ab7e2fe57c1700819d991716daacc94997e1723b809b7" protocol=ttrpc version=3 Jan 23 17:27:29.006000 audit[4769]: NETFILTER_CFG table=filter:125 family=2 entries=64 op=nft_register_chain pid=4769 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 17:27:29.006000 audit[4769]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=35832 a0=3 a1=ffffd77d5940 a2=0 a3=ffff8c47efa8 items=0 ppid=4345 pid=4769 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.006000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 17:27:29.032528 systemd[1]: Started cri-containerd-e3ea408937adb7256cf4438f5e1e7141296493463bcd6bec42802328f6261f13.scope - libcontainer container e3ea408937adb7256cf4438f5e1e7141296493463bcd6bec42802328f6261f13. 
Jan 23 17:27:29.043000 audit: BPF prog-id=231 op=LOAD Jan 23 17:27:29.043000 audit: BPF prog-id=232 op=LOAD Jan 23 17:27:29.043000 audit[4770]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4698 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.043000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533656134303839333761646237323536636634343338663565316537 Jan 23 17:27:29.044000 audit: BPF prog-id=232 op=UNLOAD Jan 23 17:27:29.044000 audit[4770]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4698 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.044000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533656134303839333761646237323536636634343338663565316537 Jan 23 17:27:29.044000 audit: BPF prog-id=233 op=LOAD Jan 23 17:27:29.044000 audit[4770]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4698 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.044000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533656134303839333761646237323536636634343338663565316537 Jan 23 17:27:29.044000 audit: BPF prog-id=234 op=LOAD Jan 23 17:27:29.044000 audit[4770]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4698 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.044000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533656134303839333761646237323536636634343338663565316537 Jan 23 17:27:29.044000 audit: BPF prog-id=234 op=UNLOAD Jan 23 17:27:29.044000 audit[4770]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4698 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.044000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533656134303839333761646237323536636634343338663565316537 Jan 23 17:27:29.044000 audit: BPF prog-id=233 op=UNLOAD Jan 23 17:27:29.044000 audit[4770]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4698 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 17:27:29.044000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533656134303839333761646237323536636634343338663565316537 Jan 23 17:27:29.044000 audit: BPF prog-id=235 op=LOAD Jan 23 17:27:29.044000 audit[4770]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4698 pid=4770 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.044000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533656134303839333761646237323536636634343338663565316537 Jan 23 17:27:29.060297 containerd[1675]: time="2026-01-23T17:27:29.060260135Z" level=info msg="StartContainer for \"e3ea408937adb7256cf4438f5e1e7141296493463bcd6bec42802328f6261f13\" returns successfully" Jan 23 17:27:29.062441 systemd-networkd[1585]: calif2ac5e9c25b: Gained IPv6LL Jan 23 17:27:29.108249 containerd[1675]: time="2026-01-23T17:27:29.108207010Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:29.109377 containerd[1675]: time="2026-01-23T17:27:29.109296935Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 17:27:29.109493 containerd[1675]: time="2026-01-23T17:27:29.109421456Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 
17:27:29.109621 kubelet[2900]: E0123 17:27:29.109583 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 17:27:29.109994 kubelet[2900]: E0123 17:27:29.109630 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 17:27:29.109994 kubelet[2900]: E0123 17:27:29.109804 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-68b765b4fc-4fz2b_calico-system(348a8202-26ae-4e13-80ce-0b3caa537034): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:29.109994 kubelet[2900]: E0123 17:27:29.109839 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034" Jan 23 
17:27:29.110608 containerd[1675]: time="2026-01-23T17:27:29.110585422Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 17:27:29.563475 containerd[1675]: time="2026-01-23T17:27:29.563424043Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p68bw,Uid:baa2fc5a-f231-4d23-992e-37e27c865a7c,Namespace:calico-system,Attempt:0,}" Jan 23 17:27:29.637450 systemd-networkd[1585]: cali5755824dd20: Gained IPv6LL Jan 23 17:27:29.639541 containerd[1675]: time="2026-01-23T17:27:29.639502136Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:29.641128 containerd[1675]: time="2026-01-23T17:27:29.641088904Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 17:27:29.641211 containerd[1675]: time="2026-01-23T17:27:29.641150344Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:29.641390 kubelet[2900]: E0123 17:27:29.641346 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 17:27:29.641440 kubelet[2900]: E0123 17:27:29.641398 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 17:27:29.641486 kubelet[2900]: E0123 
17:27:29.641464 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5bcfdccd6d-ph9sw_calico-system(d8d63139-4f9a-445f-a5a9-c69b14220343): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:29.641532 kubelet[2900]: E0123 17:27:29.641510 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343" Jan 23 17:27:29.666414 systemd-networkd[1585]: cali1551446b57b: Link UP Jan 23 17:27:29.666796 systemd-networkd[1585]: cali1551446b57b: Gained carrier Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.601 [INFO][4807] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--a--d0877fd079-k8s-csi--node--driver--p68bw-eth0 csi-node-driver- calico-system baa2fc5a-f231-4d23-992e-37e27c865a7c 746 0 2026-01-23 17:27:05 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4547-1-0-a-d0877fd079 csi-node-driver-p68bw eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali1551446b57b [] [] }} 
ContainerID="930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" Namespace="calico-system" Pod="csi-node-driver-p68bw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-csi--node--driver--p68bw-" Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.601 [INFO][4807] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" Namespace="calico-system" Pod="csi-node-driver-p68bw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-csi--node--driver--p68bw-eth0" Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.623 [INFO][4820] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" HandleID="k8s-pod-network.930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" Workload="ci--4547--1--0--a--d0877fd079-k8s-csi--node--driver--p68bw-eth0" Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.623 [INFO][4820] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" HandleID="k8s-pod-network.930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" Workload="ci--4547--1--0--a--d0877fd079-k8s-csi--node--driver--p68bw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002aa100), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-1-0-a-d0877fd079", "pod":"csi-node-driver-p68bw", "timestamp":"2026-01-23 17:27:29.623291537 +0000 UTC"}, Hostname:"ci-4547-1-0-a-d0877fd079", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.623 [INFO][4820] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.623 [INFO][4820] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.623 [INFO][4820] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-a-d0877fd079' Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.633 [INFO][4820] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.638 [INFO][4820] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.644 [INFO][4820] ipam/ipam.go 511: Trying affinity for 192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.646 [INFO][4820] ipam/ipam.go 158: Attempting to load block cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.648 [INFO][4820] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.648 [INFO][4820] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.16.0/26 handle="k8s-pod-network.930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.651 [INFO][4820] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92 Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.655 [INFO][4820] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.16.0/26 handle="k8s-pod-network.930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" 
host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.662 [INFO][4820] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.16.5/26] block=192.168.16.0/26 handle="k8s-pod-network.930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.662 [INFO][4820] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.16.5/26] handle="k8s-pod-network.930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.662 [INFO][4820] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 17:27:29.679818 containerd[1675]: 2026-01-23 17:27:29.662 [INFO][4820] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.16.5/26] IPv6=[] ContainerID="930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" HandleID="k8s-pod-network.930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" Workload="ci--4547--1--0--a--d0877fd079-k8s-csi--node--driver--p68bw-eth0" Jan 23 17:27:29.680295 containerd[1675]: 2026-01-23 17:27:29.665 [INFO][4807] cni-plugin/k8s.go 418: Populated endpoint ContainerID="930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" Namespace="calico-system" Pod="csi-node-driver-p68bw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-csi--node--driver--p68bw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-csi--node--driver--p68bw-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"baa2fc5a-f231-4d23-992e-37e27c865a7c", ResourceVersion:"746", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 27, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", ContainerID:"", Pod:"csi-node-driver-p68bw", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.16.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali1551446b57b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:29.680295 containerd[1675]: 2026-01-23 17:27:29.665 [INFO][4807] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.16.5/32] ContainerID="930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" Namespace="calico-system" Pod="csi-node-driver-p68bw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-csi--node--driver--p68bw-eth0" Jan 23 17:27:29.680295 containerd[1675]: 2026-01-23 17:27:29.665 [INFO][4807] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1551446b57b ContainerID="930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" Namespace="calico-system" Pod="csi-node-driver-p68bw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-csi--node--driver--p68bw-eth0" Jan 23 17:27:29.680295 containerd[1675]: 2026-01-23 17:27:29.666 [INFO][4807] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" Namespace="calico-system" 
Pod="csi-node-driver-p68bw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-csi--node--driver--p68bw-eth0" Jan 23 17:27:29.680295 containerd[1675]: 2026-01-23 17:27:29.667 [INFO][4807] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" Namespace="calico-system" Pod="csi-node-driver-p68bw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-csi--node--driver--p68bw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-csi--node--driver--p68bw-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"baa2fc5a-f231-4d23-992e-37e27c865a7c", ResourceVersion:"746", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 27, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", ContainerID:"930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92", Pod:"csi-node-driver-p68bw", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.16.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali1551446b57b", MAC:"ce:7e:79:cf:07:da", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:29.680295 containerd[1675]: 2026-01-23 17:27:29.677 [INFO][4807] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" Namespace="calico-system" Pod="csi-node-driver-p68bw" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-csi--node--driver--p68bw-eth0" Jan 23 17:27:29.692000 audit[4837]: NETFILTER_CFG table=filter:126 family=2 entries=48 op=nft_register_chain pid=4837 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 17:27:29.692000 audit[4837]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=23140 a0=3 a1=ffffd35f3c10 a2=0 a3=ffff89b9cfa8 items=0 ppid=4345 pid=4837 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.692000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 17:27:29.706836 containerd[1675]: time="2026-01-23T17:27:29.706482265Z" level=info msg="connecting to shim 930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92" address="unix:///run/containerd/s/bcf224973afb2450e73d44957fc045fc0f9ef65e335dda4c12f4f16f8ca5eb40" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:27:29.724770 kubelet[2900]: E0123 17:27:29.724725 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343" Jan 23 17:27:29.726475 kubelet[2900]: E0123 17:27:29.726416 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034" Jan 23 17:27:29.733032 kubelet[2900]: I0123 17:27:29.732972 2900 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-glh59" podStartSLOduration=41.732889994 podStartE2EDuration="41.732889994s" podCreationTimestamp="2026-01-23 17:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-23 17:27:29.732875874 +0000 UTC m=+47.268419566" watchObservedRunningTime="2026-01-23 17:27:29.732889994 +0000 UTC m=+47.268433686" Jan 23 17:27:29.740537 systemd[1]: Started cri-containerd-930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92.scope - libcontainer container 930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92. 
Jan 23 17:27:29.755000 audit: BPF prog-id=236 op=LOAD Jan 23 17:27:29.756000 audit: BPF prog-id=237 op=LOAD Jan 23 17:27:29.756000 audit[4859]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4847 pid=4859 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.756000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3933306162353566636161623735616165343662363237636239376362 Jan 23 17:27:29.756000 audit: BPF prog-id=237 op=UNLOAD Jan 23 17:27:29.756000 audit[4859]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4847 pid=4859 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.756000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3933306162353566636161623735616165343662363237636239376362 Jan 23 17:27:29.756000 audit: BPF prog-id=238 op=LOAD Jan 23 17:27:29.756000 audit[4859]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4847 pid=4859 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.756000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3933306162353566636161623735616165343662363237636239376362 Jan 23 17:27:29.756000 audit: BPF prog-id=239 op=LOAD Jan 23 17:27:29.756000 audit[4859]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4847 pid=4859 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.756000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3933306162353566636161623735616165343662363237636239376362 Jan 23 17:27:29.756000 audit: BPF prog-id=239 op=UNLOAD Jan 23 17:27:29.756000 audit[4859]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4847 pid=4859 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.756000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3933306162353566636161623735616165343662363237636239376362 Jan 23 17:27:29.756000 audit: BPF prog-id=238 op=UNLOAD Jan 23 17:27:29.756000 audit[4859]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4847 pid=4859 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 17:27:29.756000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3933306162353566636161623735616165343662363237636239376362 Jan 23 17:27:29.756000 audit: BPF prog-id=240 op=LOAD Jan 23 17:27:29.756000 audit[4859]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4847 pid=4859 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.756000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3933306162353566636161623735616165343662363237636239376362 Jan 23 17:27:29.767000 audit[4879]: NETFILTER_CFG table=filter:127 family=2 entries=17 op=nft_register_rule pid=4879 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:29.767000 audit[4879]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffec882660 a2=0 a3=1 items=0 ppid=3013 pid=4879 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.767000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:29.782000 audit[4879]: NETFILTER_CFG table=nat:128 family=2 entries=35 op=nft_register_chain pid=4879 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:29.782000 audit[4879]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffec882660 a2=0 a3=1 items=0 ppid=3013 pid=4879 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:29.782000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:29.791369 containerd[1675]: time="2026-01-23T17:27:29.791233121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p68bw,Uid:baa2fc5a-f231-4d23-992e-37e27c865a7c,Namespace:calico-system,Attempt:0,} returns sandbox id \"930ab55fcaab75aae46b627cb97cbb950124bba1887b0437556d659c8054bf92\"" Jan 23 17:27:29.793939 containerd[1675]: time="2026-01-23T17:27:29.793912894Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 17:27:29.958182 systemd-networkd[1585]: cali033d472ee10: Gained IPv6LL Jan 23 17:27:30.181642 containerd[1675]: time="2026-01-23T17:27:30.181583755Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:30.182744 containerd[1675]: time="2026-01-23T17:27:30.182691481Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 17:27:30.182833 containerd[1675]: time="2026-01-23T17:27:30.182792201Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:30.183014 kubelet[2900]: E0123 17:27:30.182982 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 17:27:30.183327 kubelet[2900]: E0123 17:27:30.183026 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 17:27:30.183327 kubelet[2900]: E0123 17:27:30.183093 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-p68bw_calico-system(baa2fc5a-f231-4d23-992e-37e27c865a7c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:30.184817 containerd[1675]: time="2026-01-23T17:27:30.184676611Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 17:27:30.405524 systemd-networkd[1585]: cali405ca985158: Gained IPv6LL Jan 23 17:27:30.505746 containerd[1675]: time="2026-01-23T17:27:30.505463144Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:30.507703 containerd[1675]: time="2026-01-23T17:27:30.507643155Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 17:27:30.507973 containerd[1675]: time="2026-01-23T17:27:30.507724075Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:30.508212 kubelet[2900]: E0123 17:27:30.508124 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 17:27:30.508212 kubelet[2900]: E0123 17:27:30.508196 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 17:27:30.508453 kubelet[2900]: E0123 17:27:30.508434 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-p68bw_calico-system(baa2fc5a-f231-4d23-992e-37e27c865a7c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:30.508654 kubelet[2900]: E0123 17:27:30.508629 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:27:30.661551 systemd-networkd[1585]: vxlan.calico: Gained IPv6LL Jan 23 17:27:30.726737 kubelet[2900]: E0123 17:27:30.726694 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343" Jan 23 17:27:30.727909 kubelet[2900]: E0123 17:27:30.727817 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:27:30.799000 audit[4888]: NETFILTER_CFG table=filter:129 family=2 entries=14 op=nft_register_rule pid=4888 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:30.799000 audit[4888]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff821bf40 a2=0 a3=1 items=0 ppid=3013 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:30.799000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:30.812000 audit[4888]: NETFILTER_CFG table=nat:130 family=2 entries=56 op=nft_register_chain pid=4888 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:30.812000 audit[4888]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=fffff821bf40 a2=0 a3=1 items=0 ppid=3013 pid=4888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:30.812000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:31.110425 systemd-networkd[1585]: cali1551446b57b: Gained IPv6LL Jan 23 17:27:31.730443 kubelet[2900]: E0123 17:27:31.730398 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:27:36.564121 containerd[1675]: time="2026-01-23T17:27:36.564069465Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-7c778bb748-ssq4v,Uid:4e4a5b57-4234-49d1-b775-45759cc5cd06,Namespace:calico-system,Attempt:0,}" Jan 23 17:27:36.665728 systemd-networkd[1585]: cali33c023445da: Link UP Jan 23 17:27:36.665899 systemd-networkd[1585]: cali33c023445da: Gained carrier Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.599 [INFO][4900] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--a--d0877fd079-k8s-goldmane--7c778bb748--ssq4v-eth0 goldmane-7c778bb748- calico-system 4e4a5b57-4234-49d1-b775-45759cc5cd06 845 0 2026-01-23 17:27:03 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4547-1-0-a-d0877fd079 goldmane-7c778bb748-ssq4v eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali33c023445da [] [] }} ContainerID="7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" Namespace="calico-system" Pod="goldmane-7c778bb748-ssq4v" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-goldmane--7c778bb748--ssq4v-" Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.599 [INFO][4900] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" Namespace="calico-system" Pod="goldmane-7c778bb748-ssq4v" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-goldmane--7c778bb748--ssq4v-eth0" Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.622 [INFO][4915] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" HandleID="k8s-pod-network.7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" Workload="ci--4547--1--0--a--d0877fd079-k8s-goldmane--7c778bb748--ssq4v-eth0" Jan 23 
17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.622 [INFO][4915] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" HandleID="k8s-pod-network.7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" Workload="ci--4547--1--0--a--d0877fd079-k8s-goldmane--7c778bb748--ssq4v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c2060), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-1-0-a-d0877fd079", "pod":"goldmane-7c778bb748-ssq4v", "timestamp":"2026-01-23 17:27:36.62226359 +0000 UTC"}, Hostname:"ci-4547-1-0-a-d0877fd079", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.622 [INFO][4915] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.622 [INFO][4915] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.622 [INFO][4915] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-a-d0877fd079' Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.633 [INFO][4915] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.638 [INFO][4915] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.642 [INFO][4915] ipam/ipam.go 511: Trying affinity for 192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.644 [INFO][4915] ipam/ipam.go 158: Attempting to load block cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.649 [INFO][4915] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.649 [INFO][4915] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.16.0/26 handle="k8s-pod-network.7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.651 [INFO][4915] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.655 [INFO][4915] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.16.0/26 handle="k8s-pod-network.7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.661 [INFO][4915] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.16.6/26] block=192.168.16.0/26 handle="k8s-pod-network.7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.662 [INFO][4915] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.16.6/26] handle="k8s-pod-network.7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.662 [INFO][4915] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 17:27:36.688431 containerd[1675]: 2026-01-23 17:27:36.662 [INFO][4915] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.16.6/26] IPv6=[] ContainerID="7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" HandleID="k8s-pod-network.7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" Workload="ci--4547--1--0--a--d0877fd079-k8s-goldmane--7c778bb748--ssq4v-eth0" Jan 23 17:27:36.688977 containerd[1675]: 2026-01-23 17:27:36.664 [INFO][4900] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" Namespace="calico-system" Pod="goldmane-7c778bb748-ssq4v" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-goldmane--7c778bb748--ssq4v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-goldmane--7c778bb748--ssq4v-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"4e4a5b57-4234-49d1-b775-45759cc5cd06", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 27, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", ContainerID:"", Pod:"goldmane-7c778bb748-ssq4v", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.16.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali33c023445da", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:36.688977 containerd[1675]: 2026-01-23 17:27:36.664 [INFO][4900] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.16.6/32] ContainerID="7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" Namespace="calico-system" Pod="goldmane-7c778bb748-ssq4v" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-goldmane--7c778bb748--ssq4v-eth0" Jan 23 17:27:36.688977 containerd[1675]: 2026-01-23 17:27:36.664 [INFO][4900] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali33c023445da ContainerID="7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" Namespace="calico-system" Pod="goldmane-7c778bb748-ssq4v" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-goldmane--7c778bb748--ssq4v-eth0" Jan 23 17:27:36.688977 containerd[1675]: 2026-01-23 17:27:36.666 [INFO][4900] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" Namespace="calico-system" Pod="goldmane-7c778bb748-ssq4v" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-goldmane--7c778bb748--ssq4v-eth0" Jan 23 17:27:36.688977 containerd[1675]: 2026-01-23 17:27:36.666 [INFO][4900] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" Namespace="calico-system" Pod="goldmane-7c778bb748-ssq4v" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-goldmane--7c778bb748--ssq4v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-goldmane--7c778bb748--ssq4v-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"4e4a5b57-4234-49d1-b775-45759cc5cd06", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 27, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", ContainerID:"7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e", Pod:"goldmane-7c778bb748-ssq4v", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.16.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali33c023445da", MAC:"c2:19:d2:cb:5b:a3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:36.688977 containerd[1675]: 2026-01-23 17:27:36.680 [INFO][4900] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" Namespace="calico-system" Pod="goldmane-7c778bb748-ssq4v" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-goldmane--7c778bb748--ssq4v-eth0" Jan 23 17:27:36.728299 containerd[1675]: time="2026-01-23T17:27:36.728234790Z" level=info msg="connecting to shim 7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e" address="unix:///run/containerd/s/0a7dce6762955d572d5571c105b59c623b5bd752ca64f0aeea21a1e938795f3f" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:27:36.736232 kernel: kauditd_printk_skb: 381 callbacks suppressed Jan 23 17:27:36.736360 kernel: audit: type=1325 audit(1769189256.732:706): table=filter:131 family=2 entries=60 op=nft_register_chain pid=4950 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 17:27:36.732000 audit[4950]: NETFILTER_CFG table=filter:131 family=2 entries=60 op=nft_register_chain pid=4950 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 17:27:36.732000 audit[4950]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=29932 a0=3 a1=fffff321a270 a2=0 a3=ffffbaf92fa8 items=0 ppid=4345 pid=4950 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:36.740251 kernel: audit: type=1300 audit(1769189256.732:706): arch=c00000b7 syscall=211 success=yes exit=29932 a0=3 a1=fffff321a270 a2=0 a3=ffffbaf92fa8 items=0 ppid=4345 pid=4950 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:36.743499 kernel: audit: type=1327 audit(1769189256.732:706): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 
17:27:36.732000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 17:27:36.753508 systemd[1]: Started cri-containerd-7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e.scope - libcontainer container 7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e. Jan 23 17:27:36.762000 audit: BPF prog-id=241 op=LOAD Jan 23 17:27:36.762000 audit: BPF prog-id=242 op=LOAD Jan 23 17:27:36.764541 kernel: audit: type=1334 audit(1769189256.762:707): prog-id=241 op=LOAD Jan 23 17:27:36.764586 kernel: audit: type=1334 audit(1769189256.762:708): prog-id=242 op=LOAD Jan 23 17:27:36.764605 kernel: audit: type=1300 audit(1769189256.762:708): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4942 pid=4954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:36.762000 audit[4954]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4942 pid=4954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:36.762000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765386436373330393533333633636262363535643366303661386135 Jan 23 17:27:36.770517 kernel: audit: type=1327 audit(1769189256.762:708): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765386436373330393533333633636262363535643366303661386135 Jan 23 17:27:36.763000 audit: BPF prog-id=242 op=UNLOAD Jan 23 17:27:36.763000 audit[4954]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4942 pid=4954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:36.774104 kernel: audit: type=1334 audit(1769189256.763:709): prog-id=242 op=UNLOAD Jan 23 17:27:36.774169 kernel: audit: type=1300 audit(1769189256.763:709): arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4942 pid=4954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:36.763000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765386436373330393533333633636262363535643366303661386135 Jan 23 17:27:36.777093 kernel: audit: type=1327 audit(1769189256.763:709): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765386436373330393533333633636262363535643366303661386135 Jan 23 17:27:36.763000 audit: BPF prog-id=243 op=LOAD Jan 23 17:27:36.763000 audit[4954]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4942 pid=4954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:36.763000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765386436373330393533333633636262363535643366303661386135 Jan 23 17:27:36.764000 audit: BPF prog-id=244 op=LOAD Jan 23 17:27:36.764000 audit[4954]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4942 pid=4954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:36.764000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765386436373330393533333633636262363535643366303661386135 Jan 23 17:27:36.770000 audit: BPF prog-id=244 op=UNLOAD Jan 23 17:27:36.770000 audit[4954]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4942 pid=4954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:36.770000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765386436373330393533333633636262363535643366303661386135 Jan 23 17:27:36.770000 audit: BPF prog-id=243 op=UNLOAD Jan 23 17:27:36.770000 audit[4954]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4942 pid=4954 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:36.770000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765386436373330393533333633636262363535643366303661386135 Jan 23 17:27:36.770000 audit: BPF prog-id=245 op=LOAD Jan 23 17:27:36.770000 audit[4954]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4942 pid=4954 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:36.770000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3765386436373330393533333633636262363535643366303661386135 Jan 23 17:27:36.798570 containerd[1675]: time="2026-01-23T17:27:36.798531295Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-ssq4v,Uid:4e4a5b57-4234-49d1-b775-45759cc5cd06,Namespace:calico-system,Attempt:0,} returns sandbox id \"7e8d6730953363cbb655d3f06a8a546afb0ddb86656b2dfc7456d1356e16009e\"" Jan 23 17:27:36.801075 containerd[1675]: time="2026-01-23T17:27:36.801035467Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 17:27:37.144645 containerd[1675]: time="2026-01-23T17:27:37.144517032Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:37.145901 containerd[1675]: time="2026-01-23T17:27:37.145841399Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 17:27:37.146007 containerd[1675]: time="2026-01-23T17:27:37.145945039Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:37.146183 kubelet[2900]: E0123 17:27:37.146123 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 17:27:37.146183 kubelet[2900]: E0123 17:27:37.146173 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 17:27:37.146674 kubelet[2900]: E0123 17:27:37.146246 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-ssq4v_calico-system(4e4a5b57-4234-49d1-b775-45759cc5cd06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:37.146674 kubelet[2900]: E0123 17:27:37.146276 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:27:37.563428 containerd[1675]: 
time="2026-01-23T17:27:37.563295407Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dd7cc585d-hrzqq,Uid:bdc6b38c-9de0-4613-a814-03b925209707,Namespace:calico-apiserver,Attempt:0,}" Jan 23 17:27:37.662360 systemd-networkd[1585]: calib3ac3ed0cff: Link UP Jan 23 17:27:37.662589 systemd-networkd[1585]: calib3ac3ed0cff: Gained carrier Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.597 [INFO][4980] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--hrzqq-eth0 calico-apiserver-dd7cc585d- calico-apiserver bdc6b38c-9de0-4613-a814-03b925209707 842 0 2026-01-23 17:26:57 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:dd7cc585d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-1-0-a-d0877fd079 calico-apiserver-dd7cc585d-hrzqq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib3ac3ed0cff [] [] }} ContainerID="d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-hrzqq" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--hrzqq-" Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.597 [INFO][4980] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-hrzqq" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--hrzqq-eth0" Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.620 [INFO][4996] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" HandleID="k8s-pod-network.d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" Workload="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--hrzqq-eth0" Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.620 [INFO][4996] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" HandleID="k8s-pod-network.d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" Workload="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--hrzqq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001a3690), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-1-0-a-d0877fd079", "pod":"calico-apiserver-dd7cc585d-hrzqq", "timestamp":"2026-01-23 17:27:37.620091725 +0000 UTC"}, Hostname:"ci-4547-1-0-a-d0877fd079", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.620 [INFO][4996] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.620 [INFO][4996] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.620 [INFO][4996] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-a-d0877fd079' Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.629 [INFO][4996] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.634 [INFO][4996] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.640 [INFO][4996] ipam/ipam.go 511: Trying affinity for 192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.641 [INFO][4996] ipam/ipam.go 158: Attempting to load block cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.643 [INFO][4996] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.643 [INFO][4996] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.16.0/26 handle="k8s-pod-network.d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.645 [INFO][4996] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9 Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.649 [INFO][4996] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.16.0/26 handle="k8s-pod-network.d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.655 [INFO][4996] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.16.7/26] block=192.168.16.0/26 handle="k8s-pod-network.d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.655 [INFO][4996] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.16.7/26] handle="k8s-pod-network.d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.655 [INFO][4996] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 17:27:37.674210 containerd[1675]: 2026-01-23 17:27:37.655 [INFO][4996] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.16.7/26] IPv6=[] ContainerID="d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" HandleID="k8s-pod-network.d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" Workload="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--hrzqq-eth0" Jan 23 17:27:37.675044 containerd[1675]: 2026-01-23 17:27:37.657 [INFO][4980] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-hrzqq" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--hrzqq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--hrzqq-eth0", GenerateName:"calico-apiserver-dd7cc585d-", Namespace:"calico-apiserver", SelfLink:"", UID:"bdc6b38c-9de0-4613-a814-03b925209707", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 26, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"dd7cc585d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", ContainerID:"", Pod:"calico-apiserver-dd7cc585d-hrzqq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.16.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib3ac3ed0cff", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:37.675044 containerd[1675]: 2026-01-23 17:27:37.657 [INFO][4980] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.16.7/32] ContainerID="d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-hrzqq" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--hrzqq-eth0" Jan 23 17:27:37.675044 containerd[1675]: 2026-01-23 17:27:37.657 [INFO][4980] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib3ac3ed0cff ContainerID="d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-hrzqq" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--hrzqq-eth0" Jan 23 17:27:37.675044 containerd[1675]: 2026-01-23 17:27:37.659 [INFO][4980] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-hrzqq" 
WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--hrzqq-eth0" Jan 23 17:27:37.675044 containerd[1675]: 2026-01-23 17:27:37.659 [INFO][4980] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-hrzqq" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--hrzqq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--hrzqq-eth0", GenerateName:"calico-apiserver-dd7cc585d-", Namespace:"calico-apiserver", SelfLink:"", UID:"bdc6b38c-9de0-4613-a814-03b925209707", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 26, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"dd7cc585d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", ContainerID:"d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9", Pod:"calico-apiserver-dd7cc585d-hrzqq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.16.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib3ac3ed0cff", MAC:"36:8c:cd:c4:9a:61", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:37.675044 containerd[1675]: 2026-01-23 17:27:37.671 [INFO][4980] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-hrzqq" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--hrzqq-eth0" Jan 23 17:27:37.693862 containerd[1675]: time="2026-01-23T17:27:37.693817767Z" level=info msg="connecting to shim d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9" address="unix:///run/containerd/s/96f01216ad84d4e6233eaae3e611ab4a2f1e486e0d1e6b52153ea277deb766eb" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:27:37.694000 audit[5014]: NETFILTER_CFG table=filter:132 family=2 entries=70 op=nft_register_chain pid=5014 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 17:27:37.694000 audit[5014]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=34148 a0=3 a1=ffffd465ab90 a2=0 a3=ffffb04cbfa8 items=0 ppid=4345 pid=5014 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:37.694000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 17:27:37.718765 systemd[1]: Started cri-containerd-d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9.scope - libcontainer container d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9. 
Jan 23 17:27:37.728000 audit: BPF prog-id=246 op=LOAD Jan 23 17:27:37.730000 audit: BPF prog-id=247 op=LOAD Jan 23 17:27:37.730000 audit[5034]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=5022 pid=5034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:37.730000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6431333164663138613561643638666139346335313063623861363837 Jan 23 17:27:37.730000 audit: BPF prog-id=247 op=UNLOAD Jan 23 17:27:37.730000 audit[5034]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5022 pid=5034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:37.730000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6431333164663138613561643638666139346335313063623861363837 Jan 23 17:27:37.730000 audit: BPF prog-id=248 op=LOAD Jan 23 17:27:37.730000 audit[5034]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=5022 pid=5034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:37.730000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6431333164663138613561643638666139346335313063623861363837 Jan 23 17:27:37.730000 audit: BPF prog-id=249 op=LOAD Jan 23 17:27:37.730000 audit[5034]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=5022 pid=5034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:37.730000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6431333164663138613561643638666139346335313063623861363837 Jan 23 17:27:37.730000 audit: BPF prog-id=249 op=UNLOAD Jan 23 17:27:37.730000 audit[5034]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=5022 pid=5034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:37.730000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6431333164663138613561643638666139346335313063623861363837 Jan 23 17:27:37.730000 audit: BPF prog-id=248 op=UNLOAD Jan 23 17:27:37.730000 audit[5034]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5022 pid=5034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 17:27:37.730000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6431333164663138613561643638666139346335313063623861363837 Jan 23 17:27:37.730000 audit: BPF prog-id=250 op=LOAD Jan 23 17:27:37.730000 audit[5034]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=5022 pid=5034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:37.730000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6431333164663138613561643638666139346335313063623861363837 Jan 23 17:27:37.746425 kubelet[2900]: E0123 17:27:37.746374 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:27:37.761493 containerd[1675]: time="2026-01-23T17:27:37.761447979Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dd7cc585d-hrzqq,Uid:bdc6b38c-9de0-4613-a814-03b925209707,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d131df18a5ad68fa94c510cb8a687e4bb8838604aa9c3ffe3b9c58264312fcf9\"" Jan 23 17:27:37.764325 containerd[1675]: time="2026-01-23T17:27:37.764163712Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 17:27:37.773000 audit[5060]: NETFILTER_CFG table=filter:133 family=2 entries=14 op=nft_register_rule pid=5060 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:37.773000 audit[5060]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffdaddfc10 a2=0 a3=1 items=0 ppid=3013 pid=5060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:37.773000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:37.782000 audit[5060]: NETFILTER_CFG table=nat:134 family=2 entries=20 op=nft_register_rule pid=5060 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:37.782000 audit[5060]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffdaddfc10 a2=0 a3=1 items=0 ppid=3013 pid=5060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:37.782000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:38.077209 containerd[1675]: time="2026-01-23T17:27:38.077157287Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:38.078389 containerd[1675]: time="2026-01-23T17:27:38.078345293Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 17:27:38.078481 containerd[1675]: time="2026-01-23T17:27:38.078427454Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:38.078602 kubelet[2900]: E0123 17:27:38.078570 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:27:38.078602 kubelet[2900]: E0123 17:27:38.078611 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:27:38.078701 kubelet[2900]: E0123 17:27:38.078683 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-dd7cc585d-hrzqq_calico-apiserver(bdc6b38c-9de0-4613-a814-03b925209707): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:38.078729 kubelet[2900]: E0123 17:27:38.078712 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707" Jan 23 17:27:38.278003 systemd-networkd[1585]: cali33c023445da: Gained IPv6LL Jan 23 17:27:38.563785 containerd[1675]: time="2026-01-23T17:27:38.563744914Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-dd7cc585d-gp992,Uid:bad712f1-634f-4629-aa4d-a4a0636cb622,Namespace:calico-apiserver,Attempt:0,}" Jan 23 17:27:38.663568 systemd-networkd[1585]: cali4909f36b1aa: Link UP Jan 23 17:27:38.664224 systemd-networkd[1585]: cali4909f36b1aa: Gained carrier Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.600 [INFO][5062] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--gp992-eth0 calico-apiserver-dd7cc585d- calico-apiserver bad712f1-634f-4629-aa4d-a4a0636cb622 841 0 2026-01-23 17:26:57 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:dd7cc585d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-1-0-a-d0877fd079 calico-apiserver-dd7cc585d-gp992 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali4909f36b1aa [] [] }} ContainerID="72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-gp992" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--gp992-" Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.600 [INFO][5062] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-gp992" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--gp992-eth0" Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.621 [INFO][5079] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" 
HandleID="k8s-pod-network.72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" Workload="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--gp992-eth0" Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.621 [INFO][5079] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" HandleID="k8s-pod-network.72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" Workload="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--gp992-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3180), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-1-0-a-d0877fd079", "pod":"calico-apiserver-dd7cc585d-gp992", "timestamp":"2026-01-23 17:27:38.621206196 +0000 UTC"}, Hostname:"ci-4547-1-0-a-d0877fd079", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.621 [INFO][5079] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.621 [INFO][5079] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.621 [INFO][5079] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-1-0-a-d0877fd079' Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.632 [INFO][5079] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.636 [INFO][5079] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.641 [INFO][5079] ipam/ipam.go 511: Trying affinity for 192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.642 [INFO][5079] ipam/ipam.go 158: Attempting to load block cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.645 [INFO][5079] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.16.0/26 host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.645 [INFO][5079] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.16.0/26 handle="k8s-pod-network.72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.647 [INFO][5079] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.650 [INFO][5079] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.16.0/26 handle="k8s-pod-network.72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.659 [INFO][5079] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.16.8/26] block=192.168.16.0/26 handle="k8s-pod-network.72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.659 [INFO][5079] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.16.8/26] handle="k8s-pod-network.72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" host="ci-4547-1-0-a-d0877fd079" Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.659 [INFO][5079] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 23 17:27:38.677292 containerd[1675]: 2026-01-23 17:27:38.659 [INFO][5079] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.16.8/26] IPv6=[] ContainerID="72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" HandleID="k8s-pod-network.72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" Workload="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--gp992-eth0" Jan 23 17:27:38.678000 containerd[1675]: 2026-01-23 17:27:38.661 [INFO][5062] cni-plugin/k8s.go 418: Populated endpoint ContainerID="72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-gp992" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--gp992-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--gp992-eth0", GenerateName:"calico-apiserver-dd7cc585d-", Namespace:"calico-apiserver", SelfLink:"", UID:"bad712f1-634f-4629-aa4d-a4a0636cb622", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 26, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"dd7cc585d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", ContainerID:"", Pod:"calico-apiserver-dd7cc585d-gp992", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.16.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4909f36b1aa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:38.678000 containerd[1675]: 2026-01-23 17:27:38.661 [INFO][5062] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.16.8/32] ContainerID="72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-gp992" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--gp992-eth0" Jan 23 17:27:38.678000 containerd[1675]: 2026-01-23 17:27:38.661 [INFO][5062] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4909f36b1aa ContainerID="72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-gp992" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--gp992-eth0" Jan 23 17:27:38.678000 containerd[1675]: 2026-01-23 17:27:38.664 [INFO][5062] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-gp992" 
WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--gp992-eth0" Jan 23 17:27:38.678000 containerd[1675]: 2026-01-23 17:27:38.664 [INFO][5062] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-gp992" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--gp992-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--gp992-eth0", GenerateName:"calico-apiserver-dd7cc585d-", Namespace:"calico-apiserver", SelfLink:"", UID:"bad712f1-634f-4629-aa4d-a4a0636cb622", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2026, time.January, 23, 17, 26, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"dd7cc585d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-1-0-a-d0877fd079", ContainerID:"72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f", Pod:"calico-apiserver-dd7cc585d-gp992", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.16.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4909f36b1aa", MAC:"22:04:5e:ef:8d:95", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 23 17:27:38.678000 containerd[1675]: 2026-01-23 17:27:38.673 [INFO][5062] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" Namespace="calico-apiserver" Pod="calico-apiserver-dd7cc585d-gp992" WorkloadEndpoint="ci--4547--1--0--a--d0877fd079-k8s-calico--apiserver--dd7cc585d--gp992-eth0" Jan 23 17:27:38.690000 audit[5095]: NETFILTER_CFG table=filter:135 family=2 entries=67 op=nft_register_chain pid=5095 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 23 17:27:38.690000 audit[5095]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=31868 a0=3 a1=ffffc9a43e90 a2=0 a3=ffff9ca66fa8 items=0 ppid=4345 pid=5095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:38.690000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 23 17:27:38.700891 containerd[1675]: time="2026-01-23T17:27:38.700850187Z" level=info msg="connecting to shim 72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f" address="unix:///run/containerd/s/beea0dc4e6493ff4f4a069bcc0341be8767f485544c6f01c8e1fa7f2fafcefd8" namespace=k8s.io protocol=ttrpc version=3 Jan 23 17:27:38.725516 systemd[1]: Started cri-containerd-72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f.scope - libcontainer container 72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f. 
Jan 23 17:27:38.734000 audit: BPF prog-id=251 op=LOAD Jan 23 17:27:38.735000 audit: BPF prog-id=252 op=LOAD Jan 23 17:27:38.735000 audit[5116]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000138180 a2=98 a3=0 items=0 ppid=5104 pid=5116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:38.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732633365356231656238313866336166396637653639336133653437 Jan 23 17:27:38.735000 audit: BPF prog-id=252 op=UNLOAD Jan 23 17:27:38.735000 audit[5116]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5104 pid=5116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:38.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732633365356231656238313866336166396637653639336133653437 Jan 23 17:27:38.735000 audit: BPF prog-id=253 op=LOAD Jan 23 17:27:38.735000 audit[5116]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001383e8 a2=98 a3=0 items=0 ppid=5104 pid=5116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:38.735000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732633365356231656238313866336166396637653639336133653437 Jan 23 17:27:38.735000 audit: BPF prog-id=254 op=LOAD Jan 23 17:27:38.735000 audit[5116]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000138168 a2=98 a3=0 items=0 ppid=5104 pid=5116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:38.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732633365356231656238313866336166396637653639336133653437 Jan 23 17:27:38.735000 audit: BPF prog-id=254 op=UNLOAD Jan 23 17:27:38.735000 audit[5116]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=5104 pid=5116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:38.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732633365356231656238313866336166396637653639336133653437 Jan 23 17:27:38.735000 audit: BPF prog-id=253 op=UNLOAD Jan 23 17:27:38.735000 audit[5116]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=5104 pid=5116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 17:27:38.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732633365356231656238313866336166396637653639336133653437 Jan 23 17:27:38.735000 audit: BPF prog-id=255 op=LOAD Jan 23 17:27:38.735000 audit[5116]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000138648 a2=98 a3=0 items=0 ppid=5104 pid=5116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:38.735000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732633365356231656238313866336166396637653639336133653437 Jan 23 17:27:38.749455 kubelet[2900]: E0123 17:27:38.748922 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:27:38.750428 kubelet[2900]: E0123 17:27:38.750387 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707" Jan 23 17:27:38.766867 containerd[1675]: time="2026-01-23T17:27:38.766829470Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dd7cc585d-gp992,Uid:bad712f1-634f-4629-aa4d-a4a0636cb622,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"72c3e5b1eb818f3af9f7e693a3e47e20b0befb68879faef86cbbefa7a198df9f\"" Jan 23 17:27:38.769472 containerd[1675]: time="2026-01-23T17:27:38.769437763Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 17:27:38.806000 audit[5143]: NETFILTER_CFG table=filter:136 family=2 entries=14 op=nft_register_rule pid=5143 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:38.806000 audit[5143]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff4466f00 a2=0 a3=1 items=0 ppid=3013 pid=5143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:38.806000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:38.818000 audit[5143]: NETFILTER_CFG table=nat:137 family=2 entries=20 op=nft_register_rule pid=5143 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:38.818000 audit[5143]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=fffff4466f00 a2=0 a3=1 items=0 ppid=3013 pid=5143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:38.818000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:39.314665 
containerd[1675]: time="2026-01-23T17:27:39.314458117Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:39.316323 containerd[1675]: time="2026-01-23T17:27:39.316240926Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 17:27:39.316436 containerd[1675]: time="2026-01-23T17:27:39.316282206Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:39.316508 kubelet[2900]: E0123 17:27:39.316464 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:27:39.316551 kubelet[2900]: E0123 17:27:39.316515 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:27:39.316597 kubelet[2900]: E0123 17:27:39.316579 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-dd7cc585d-gp992_calico-apiserver(bad712f1-634f-4629-aa4d-a4a0636cb622): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:39.316624 kubelet[2900]: E0123 17:27:39.316610 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622" Jan 23 17:27:39.493506 systemd-networkd[1585]: calib3ac3ed0cff: Gained IPv6LL Jan 23 17:27:39.750573 systemd-networkd[1585]: cali4909f36b1aa: Gained IPv6LL Jan 23 17:27:39.752982 kubelet[2900]: E0123 17:27:39.752831 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622" Jan 23 17:27:39.753812 kubelet[2900]: E0123 17:27:39.753761 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707" Jan 23 17:27:39.844000 audit[5151]: NETFILTER_CFG table=filter:138 family=2 entries=14 op=nft_register_rule pid=5151 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:39.844000 audit[5151]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd8e96f90 a2=0 a3=1 items=0 
ppid=3013 pid=5151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:39.844000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:39.850000 audit[5151]: NETFILTER_CFG table=nat:139 family=2 entries=20 op=nft_register_rule pid=5151 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:27:39.850000 audit[5151]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffd8e96f90 a2=0 a3=1 items=0 ppid=3013 pid=5151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:27:39.850000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:27:40.754472 kubelet[2900]: E0123 17:27:40.754418 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622" Jan 23 17:27:42.561410 containerd[1675]: time="2026-01-23T17:27:42.561297724Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 17:27:42.923857 containerd[1675]: time="2026-01-23T17:27:42.923741302Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:42.925200 containerd[1675]: time="2026-01-23T17:27:42.925136229Z" level=error 
msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 17:27:42.925315 containerd[1675]: time="2026-01-23T17:27:42.925223950Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:42.925432 kubelet[2900]: E0123 17:27:42.925381 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 17:27:42.925432 kubelet[2900]: E0123 17:27:42.925428 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 17:27:42.925809 kubelet[2900]: E0123 17:27:42.925496 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5bcfdccd6d-ph9sw_calico-system(d8d63139-4f9a-445f-a5a9-c69b14220343): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:42.925809 kubelet[2900]: E0123 17:27:42.925527 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull 
and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343" Jan 23 17:27:44.562065 containerd[1675]: time="2026-01-23T17:27:44.561970899Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 17:27:44.903849 containerd[1675]: time="2026-01-23T17:27:44.903791336Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:44.905446 containerd[1675]: time="2026-01-23T17:27:44.905357623Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 17:27:44.905535 containerd[1675]: time="2026-01-23T17:27:44.905422784Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:44.905606 kubelet[2900]: E0123 17:27:44.905573 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 17:27:44.905910 kubelet[2900]: E0123 17:27:44.905616 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 17:27:44.905910 kubelet[2900]: E0123 17:27:44.905786 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-p68bw_calico-system(baa2fc5a-f231-4d23-992e-37e27c865a7c): 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:44.905998 containerd[1675]: time="2026-01-23T17:27:44.905883226Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 17:27:45.252955 containerd[1675]: time="2026-01-23T17:27:45.252695887Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:45.254370 containerd[1675]: time="2026-01-23T17:27:45.254253255Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 17:27:45.254370 containerd[1675]: time="2026-01-23T17:27:45.254298415Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:45.254773 kubelet[2900]: E0123 17:27:45.254534 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 17:27:45.254773 kubelet[2900]: E0123 17:27:45.254577 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 17:27:45.255251 containerd[1675]: time="2026-01-23T17:27:45.255221539Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 17:27:45.255399 kubelet[2900]: E0123 17:27:45.255347 2900 
kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-68b765b4fc-4fz2b_calico-system(348a8202-26ae-4e13-80ce-0b3caa537034): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:45.594264 containerd[1675]: time="2026-01-23T17:27:45.593935481Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:45.595575 containerd[1675]: time="2026-01-23T17:27:45.595507009Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 17:27:45.595655 containerd[1675]: time="2026-01-23T17:27:45.595549489Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:45.595848 kubelet[2900]: E0123 17:27:45.595792 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 17:27:45.595919 kubelet[2900]: E0123 17:27:45.595843 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 17:27:45.596035 kubelet[2900]: E0123 17:27:45.595997 2900 
kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-p68bw_calico-system(baa2fc5a-f231-4d23-992e-37e27c865a7c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:45.596127 kubelet[2900]: E0123 17:27:45.596052 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:27:45.596187 containerd[1675]: time="2026-01-23T17:27:45.596135692Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 17:27:45.934233 containerd[1675]: time="2026-01-23T17:27:45.934139910Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:45.935647 containerd[1675]: time="2026-01-23T17:27:45.935584557Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 17:27:45.935647 containerd[1675]: time="2026-01-23T17:27:45.935614797Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:45.935854 kubelet[2900]: E0123 17:27:45.935802 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 17:27:45.935854 kubelet[2900]: E0123 17:27:45.935853 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 17:27:45.936135 kubelet[2900]: E0123 17:27:45.935925 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-68b765b4fc-4fz2b_calico-system(348a8202-26ae-4e13-80ce-0b3caa537034): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:45.936135 kubelet[2900]: E0123 17:27:45.935960 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" 
pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034" Jan 23 17:27:51.561513 containerd[1675]: time="2026-01-23T17:27:51.561265474Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 17:27:51.944529 containerd[1675]: time="2026-01-23T17:27:51.944427234Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:51.945559 containerd[1675]: time="2026-01-23T17:27:51.945527319Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 17:27:51.945658 containerd[1675]: time="2026-01-23T17:27:51.945608279Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:51.945791 kubelet[2900]: E0123 17:27:51.945756 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 17:27:51.946063 kubelet[2900]: E0123 17:27:51.945802 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 17:27:51.946063 kubelet[2900]: E0123 17:27:51.945874 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-ssq4v_calico-system(4e4a5b57-4234-49d1-b775-45759cc5cd06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:51.946063 kubelet[2900]: E0123 17:27:51.945905 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:27:53.561831 containerd[1675]: time="2026-01-23T17:27:53.561613007Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 17:27:53.889782 containerd[1675]: time="2026-01-23T17:27:53.889648776Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:53.890780 containerd[1675]: time="2026-01-23T17:27:53.890734141Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 17:27:53.890858 containerd[1675]: time="2026-01-23T17:27:53.890814622Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 17:27:53.891000 kubelet[2900]: E0123 17:27:53.890965 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:27:53.891278 kubelet[2900]: E0123 17:27:53.891011 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:27:53.891335 kubelet[2900]: E0123 17:27:53.891268 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-dd7cc585d-hrzqq_calico-apiserver(bdc6b38c-9de0-4613-a814-03b925209707): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:53.891379 kubelet[2900]: E0123 17:27:53.891343 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707" Jan 23 17:27:53.892208 containerd[1675]: time="2026-01-23T17:27:53.892131948Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 17:27:54.231341 containerd[1675]: time="2026-01-23T17:27:54.231220332Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:27:54.232769 containerd[1675]: time="2026-01-23T17:27:54.232585938Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 17:27:54.232769 containerd[1675]: time="2026-01-23T17:27:54.232612418Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes 
read=0" Jan 23 17:27:54.232899 kubelet[2900]: E0123 17:27:54.232795 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:27:54.232899 kubelet[2900]: E0123 17:27:54.232833 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:27:54.232990 kubelet[2900]: E0123 17:27:54.232892 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-dd7cc585d-gp992_calico-apiserver(bad712f1-634f-4629-aa4d-a4a0636cb622): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 17:27:54.232990 kubelet[2900]: E0123 17:27:54.232942 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622" Jan 23 17:27:56.562907 kubelet[2900]: E0123 17:27:56.562850 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:27:57.562514 kubelet[2900]: E0123 17:27:57.562463 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343" Jan 23 17:28:00.563529 kubelet[2900]: E0123 17:28:00.563330 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034" Jan 23 17:28:06.565058 kubelet[2900]: E0123 17:28:06.565012 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:28:07.561848 kubelet[2900]: E0123 17:28:07.561800 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622" Jan 23 17:28:09.563596 kubelet[2900]: E0123 17:28:09.563548 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707" Jan 23 17:28:10.561655 
containerd[1675]: time="2026-01-23T17:28:10.561578081Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 17:28:10.938205 containerd[1675]: time="2026-01-23T17:28:10.938082728Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:28:10.939863 containerd[1675]: time="2026-01-23T17:28:10.939812216Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 17:28:10.939920 containerd[1675]: time="2026-01-23T17:28:10.939867576Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 17:28:10.940079 kubelet[2900]: E0123 17:28:10.940039 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 17:28:10.940359 kubelet[2900]: E0123 17:28:10.940090 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 17:28:10.940477 kubelet[2900]: E0123 17:28:10.940168 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5bcfdccd6d-ph9sw_calico-system(d8d63139-4f9a-445f-a5a9-c69b14220343): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 17:28:10.940523 kubelet[2900]: E0123 17:28:10.940500 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343" Jan 23 17:28:11.562408 containerd[1675]: time="2026-01-23T17:28:11.562005548Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 17:28:11.893290 containerd[1675]: time="2026-01-23T17:28:11.893166933Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:28:11.895611 containerd[1675]: time="2026-01-23T17:28:11.895382464Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 17:28:11.895611 containerd[1675]: time="2026-01-23T17:28:11.895453504Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 17:28:11.895758 kubelet[2900]: E0123 17:28:11.895607 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 17:28:11.895758 kubelet[2900]: E0123 17:28:11.895653 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 17:28:11.895858 kubelet[2900]: E0123 17:28:11.895834 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-p68bw_calico-system(baa2fc5a-f231-4d23-992e-37e27c865a7c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 17:28:11.897376 containerd[1675]: time="2026-01-23T17:28:11.895965427Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 17:28:12.228597 containerd[1675]: time="2026-01-23T17:28:12.228281577Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:28:12.230408 containerd[1675]: time="2026-01-23T17:28:12.230363707Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 17:28:12.230665 containerd[1675]: time="2026-01-23T17:28:12.230444507Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 17:28:12.230705 kubelet[2900]: E0123 17:28:12.230604 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 17:28:12.230705 kubelet[2900]: E0123 17:28:12.230675 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 17:28:12.232188 kubelet[2900]: E0123 17:28:12.231905 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-68b765b4fc-4fz2b_calico-system(348a8202-26ae-4e13-80ce-0b3caa537034): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 17:28:12.232436 containerd[1675]: time="2026-01-23T17:28:12.230947070Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 17:28:12.569987 containerd[1675]: time="2026-01-23T17:28:12.569762372Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:28:12.571380 containerd[1675]: time="2026-01-23T17:28:12.571276259Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 17:28:12.571380 containerd[1675]: time="2026-01-23T17:28:12.571331340Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 17:28:12.571514 kubelet[2900]: E0123 17:28:12.571475 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 17:28:12.571581 kubelet[2900]: E0123 17:28:12.571520 2900 
kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 17:28:12.571581 kubelet[2900]: E0123 17:28:12.571677 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-p68bw_calico-system(baa2fc5a-f231-4d23-992e-37e27c865a7c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 23 17:28:12.571581 kubelet[2900]: E0123 17:28:12.571719 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:28:12.571911 containerd[1675]: time="2026-01-23T17:28:12.571752622Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 17:28:12.907849 containerd[1675]: time="2026-01-23T17:28:12.907795590Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:28:12.909682 containerd[1675]: time="2026-01-23T17:28:12.909639079Z" level=error 
msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 17:28:12.909784 containerd[1675]: time="2026-01-23T17:28:12.909721880Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 17:28:12.909984 kubelet[2900]: E0123 17:28:12.909946 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 17:28:12.910036 kubelet[2900]: E0123 17:28:12.909992 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 17:28:12.910088 kubelet[2900]: E0123 17:28:12.910070 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-68b765b4fc-4fz2b_calico-system(348a8202-26ae-4e13-80ce-0b3caa537034): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 17:28:12.910133 kubelet[2900]: E0123 17:28:12.910108 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034" Jan 23 17:28:18.562197 containerd[1675]: time="2026-01-23T17:28:18.562148688Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 17:28:18.919431 containerd[1675]: time="2026-01-23T17:28:18.919279600Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:28:18.920740 containerd[1675]: time="2026-01-23T17:28:18.920704767Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 17:28:18.920819 containerd[1675]: time="2026-01-23T17:28:18.920785127Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 17:28:18.920983 kubelet[2900]: E0123 17:28:18.920944 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 17:28:18.921281 kubelet[2900]: E0123 17:28:18.920992 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 17:28:18.921281 kubelet[2900]: E0123 17:28:18.921066 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-ssq4v_calico-system(4e4a5b57-4234-49d1-b775-45759cc5cd06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 17:28:18.921281 kubelet[2900]: E0123 17:28:18.921099 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:28:19.562369 containerd[1675]: time="2026-01-23T17:28:19.562294994Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 17:28:19.904167 containerd[1675]: time="2026-01-23T17:28:19.904124791Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:28:19.905580 containerd[1675]: time="2026-01-23T17:28:19.905544078Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 17:28:19.905654 containerd[1675]: time="2026-01-23T17:28:19.905586478Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 17:28:19.906676 kubelet[2900]: E0123 17:28:19.906491 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:28:19.906676 kubelet[2900]: E0123 17:28:19.906534 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:28:19.906676 kubelet[2900]: E0123 17:28:19.906606 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-dd7cc585d-gp992_calico-apiserver(bad712f1-634f-4629-aa4d-a4a0636cb622): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 17:28:19.906676 kubelet[2900]: E0123 17:28:19.906634 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622" Jan 23 17:28:21.562555 containerd[1675]: time="2026-01-23T17:28:21.562459326Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 17:28:21.912967 containerd[1675]: time="2026-01-23T17:28:21.912919005Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:28:21.914236 containerd[1675]: time="2026-01-23T17:28:21.914192811Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc 
error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 17:28:21.914348 containerd[1675]: time="2026-01-23T17:28:21.914270412Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 17:28:21.914447 kubelet[2900]: E0123 17:28:21.914415 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:28:21.914710 kubelet[2900]: E0123 17:28:21.914460 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:28:21.914710 kubelet[2900]: E0123 17:28:21.914532 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-dd7cc585d-hrzqq_calico-apiserver(bdc6b38c-9de0-4613-a814-03b925209707): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 17:28:21.914710 kubelet[2900]: E0123 17:28:21.914561 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707" Jan 23 17:28:22.562860 kubelet[2900]: E0123 17:28:22.562176 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343" Jan 23 17:28:25.563909 kubelet[2900]: E0123 17:28:25.563852 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034" Jan 23 17:28:27.562161 kubelet[2900]: E0123 17:28:27.562114 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack 
image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:28:29.561534 kubelet[2900]: E0123 17:28:29.561416 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:28:32.563224 kubelet[2900]: E0123 17:28:32.563046 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622" Jan 23 17:28:34.561157 kubelet[2900]: E0123 17:28:34.561086 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707" Jan 23 17:28:35.561873 kubelet[2900]: E0123 17:28:35.561773 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343" Jan 23 17:28:38.562368 kubelet[2900]: E0123 17:28:38.562041 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:28:39.562163 kubelet[2900]: E0123 17:28:39.562092 
2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034" Jan 23 17:28:40.561840 kubelet[2900]: E0123 17:28:40.561716 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:28:47.561649 kubelet[2900]: E0123 17:28:47.561580 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" 
podUID="bdc6b38c-9de0-4613-a814-03b925209707" Jan 23 17:28:47.562712 kubelet[2900]: E0123 17:28:47.562042 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622" Jan 23 17:28:50.561978 kubelet[2900]: E0123 17:28:50.561918 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343" Jan 23 17:28:50.562743 kubelet[2900]: E0123 17:28:50.562379 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:28:51.562052 kubelet[2900]: E0123 17:28:51.561695 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:28:54.562459 containerd[1675]: time="2026-01-23T17:28:54.562416569Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 23 17:28:54.928494 containerd[1675]: time="2026-01-23T17:28:54.928074002Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:28:54.929553 containerd[1675]: time="2026-01-23T17:28:54.929486569Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 23 17:28:54.929611 containerd[1675]: time="2026-01-23T17:28:54.929531770Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 23 17:28:54.929789 kubelet[2900]: E0123 17:28:54.929746 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 
23 17:28:54.930191 kubelet[2900]: E0123 17:28:54.929791 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 23 17:28:54.930191 kubelet[2900]: E0123 17:28:54.929857 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-68b765b4fc-4fz2b_calico-system(348a8202-26ae-4e13-80ce-0b3caa537034): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 23 17:28:54.931750 containerd[1675]: time="2026-01-23T17:28:54.931696740Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 23 17:28:55.277262 containerd[1675]: time="2026-01-23T17:28:55.277136795Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:28:55.279086 containerd[1675]: time="2026-01-23T17:28:55.279017764Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 23 17:28:55.279213 containerd[1675]: time="2026-01-23T17:28:55.279090564Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 23 17:28:55.279308 kubelet[2900]: E0123 17:28:55.279249 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 17:28:55.279394 kubelet[2900]: E0123 17:28:55.279301 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 23 17:28:55.279823 kubelet[2900]: E0123 17:28:55.279388 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-68b765b4fc-4fz2b_calico-system(348a8202-26ae-4e13-80ce-0b3caa537034): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 23 17:28:55.280753 kubelet[2900]: E0123 17:28:55.279425 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034" Jan 23 17:28:58.562087 kubelet[2900]: E0123 17:28:58.561867 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707" Jan 23 17:28:58.934063 systemd[1]: Started sshd@7-10.0.1.37:22-4.153.228.146:35658.service - OpenSSH per-connection server daemon (4.153.228.146:35658). Jan 23 17:28:58.932000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.1.37:22-4.153.228.146:35658 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:28:58.937872 kernel: kauditd_printk_skb: 83 callbacks suppressed Jan 23 17:28:58.938109 kernel: audit: type=1130 audit(1769189338.932:739): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.1.37:22-4.153.228.146:35658 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:28:59.451000 audit[5282]: USER_ACCT pid=5282 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:28:59.453423 sshd[5282]: Accepted publickey for core from 4.153.228.146 port 35658 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:28:59.455000 audit[5282]: CRED_ACQ pid=5282 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:28:59.457537 sshd-session[5282]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:28:59.459465 kernel: audit: type=1101 audit(1769189339.451:740): pid=5282 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:28:59.459607 kernel: audit: type=1103 audit(1769189339.455:741): pid=5282 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:28:59.459679 kernel: audit: type=1006 audit(1769189339.455:742): pid=5282 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Jan 23 17:28:59.455000 audit[5282]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdd055a40 a2=3 a3=0 items=0 ppid=1 pid=5282 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:28:59.464432 kernel: audit: type=1300 audit(1769189339.455:742): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdd055a40 a2=3 a3=0 items=0 ppid=1 pid=5282 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:28:59.455000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:28:59.466005 kernel: audit: type=1327 audit(1769189339.455:742): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:28:59.468895 systemd-logind[1644]: New session 9 of user core. Jan 23 17:28:59.482730 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 23 17:28:59.485000 audit[5282]: USER_START pid=5282 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:28:59.488000 audit[5286]: CRED_ACQ pid=5286 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:28:59.492657 kernel: audit: type=1105 audit(1769189339.485:743): pid=5282 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:28:59.492755 kernel: audit: type=1103 audit(1769189339.488:744): pid=5286 
uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:28:59.815930 sshd[5286]: Connection closed by 4.153.228.146 port 35658 Jan 23 17:28:59.815820 sshd-session[5282]: pam_unix(sshd:session): session closed for user core Jan 23 17:28:59.816000 audit[5282]: USER_END pid=5282 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:28:59.819721 systemd[1]: sshd@7-10.0.1.37:22-4.153.228.146:35658.service: Deactivated successfully. Jan 23 17:28:59.816000 audit[5282]: CRED_DISP pid=5282 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:28:59.821513 systemd[1]: session-9.scope: Deactivated successfully. 
Jan 23 17:28:59.824530 kernel: audit: type=1106 audit(1769189339.816:745): pid=5282 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:28:59.824597 kernel: audit: type=1104 audit(1769189339.816:746): pid=5282 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:28:59.816000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.1.37:22-4.153.228.146:35658 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:28:59.824532 systemd-logind[1644]: Session 9 logged out. Waiting for processes to exit. Jan 23 17:28:59.828762 systemd-logind[1644]: Removed session 9. 
Jan 23 17:29:00.562436 containerd[1675]: time="2026-01-23T17:29:00.562210001Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 17:29:00.908172 containerd[1675]: time="2026-01-23T17:29:00.908116138Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:29:00.909406 containerd[1675]: time="2026-01-23T17:29:00.909358744Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 17:29:00.909822 containerd[1675]: time="2026-01-23T17:29:00.909474264Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 17:29:00.909891 kubelet[2900]: E0123 17:29:00.909629 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:29:00.909891 kubelet[2900]: E0123 17:29:00.909669 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:29:00.909891 kubelet[2900]: E0123 17:29:00.909743 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-dd7cc585d-gp992_calico-apiserver(bad712f1-634f-4629-aa4d-a4a0636cb622): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 17:29:00.909891 kubelet[2900]: E0123 17:29:00.909773 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622" Jan 23 17:29:02.562510 containerd[1675]: time="2026-01-23T17:29:02.562418093Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 17:29:02.907913 containerd[1675]: time="2026-01-23T17:29:02.907829427Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:29:02.909354 containerd[1675]: time="2026-01-23T17:29:02.909322155Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 17:29:02.909467 containerd[1675]: time="2026-01-23T17:29:02.909355995Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 17:29:02.909625 kubelet[2900]: E0123 17:29:02.909594 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 17:29:02.910343 kubelet[2900]: E0123 17:29:02.909903 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 17:29:02.910343 kubelet[2900]: E0123 17:29:02.909988 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-ssq4v_calico-system(4e4a5b57-4234-49d1-b775-45759cc5cd06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 17:29:02.910499 kubelet[2900]: E0123 17:29:02.910476 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:29:03.562577 containerd[1675]: time="2026-01-23T17:29:03.562538239Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 17:29:03.914886 containerd[1675]: time="2026-01-23T17:29:03.914835447Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:29:03.917513 containerd[1675]: time="2026-01-23T17:29:03.917464340Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 17:29:03.918388 containerd[1675]: time="2026-01-23T17:29:03.917543221Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 17:29:03.918439 kubelet[2900]: E0123 
17:29:03.917691 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 17:29:03.918439 kubelet[2900]: E0123 17:29:03.917741 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 17:29:03.918439 kubelet[2900]: E0123 17:29:03.917806 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5bcfdccd6d-ph9sw_calico-system(d8d63139-4f9a-445f-a5a9-c69b14220343): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 17:29:03.918439 kubelet[2900]: E0123 17:29:03.917836 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343" Jan 23 17:29:04.928854 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 17:29:04.928979 kernel: audit: type=1130 audit(1769189344.924:748): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 
msg='unit=sshd@8-10.0.1.37:22-4.153.228.146:52574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:04.924000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.1.37:22-4.153.228.146:52574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:04.925732 systemd[1]: Started sshd@8-10.0.1.37:22-4.153.228.146:52574.service - OpenSSH per-connection server daemon (4.153.228.146:52574). Jan 23 17:29:05.467000 audit[5305]: USER_ACCT pid=5305 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:05.469502 sshd[5305]: Accepted publickey for core from 4.153.228.146 port 52574 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:29:05.469000 audit[5305]: CRED_ACQ pid=5305 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:05.472602 sshd-session[5305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:29:05.474556 kernel: audit: type=1101 audit(1769189345.467:749): pid=5305 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:05.474613 kernel: audit: type=1103 audit(1769189345.469:750): pid=5305 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:05.476445 kernel: audit: type=1006 audit(1769189345.469:751): pid=5305 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Jan 23 17:29:05.469000 audit[5305]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff02a1bc0 a2=3 a3=0 items=0 ppid=1 pid=5305 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:05.479867 kernel: audit: type=1300 audit(1769189345.469:751): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff02a1bc0 a2=3 a3=0 items=0 ppid=1 pid=5305 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:05.469000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:05.481357 kernel: audit: type=1327 audit(1769189345.469:751): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:05.482275 systemd-logind[1644]: New session 10 of user core. Jan 23 17:29:05.489657 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 23 17:29:05.490000 audit[5305]: USER_START pid=5305 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:05.492000 audit[5309]: CRED_ACQ pid=5309 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:05.498008 kernel: audit: type=1105 audit(1769189345.490:752): pid=5305 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:05.498066 kernel: audit: type=1103 audit(1769189345.492:753): pid=5309 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:05.561612 containerd[1675]: time="2026-01-23T17:29:05.561573845Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 23 17:29:05.819321 sshd[5309]: Connection closed by 4.153.228.146 port 52574 Jan 23 17:29:05.819549 sshd-session[5305]: pam_unix(sshd:session): session closed for user core Jan 23 17:29:05.819000 audit[5305]: USER_END pid=5305 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:05.824171 systemd[1]: sshd@8-10.0.1.37:22-4.153.228.146:52574.service: Deactivated successfully. Jan 23 17:29:05.820000 audit[5305]: CRED_DISP pid=5305 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:05.825924 systemd[1]: session-10.scope: Deactivated successfully. Jan 23 17:29:05.828016 kernel: audit: type=1106 audit(1769189345.819:754): pid=5305 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:05.828078 kernel: audit: type=1104 audit(1769189345.820:755): pid=5305 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:05.828077 systemd-logind[1644]: Session 10 logged out. Waiting for processes to exit. Jan 23 17:29:05.822000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.1.37:22-4.153.228.146:52574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:05.828937 systemd-logind[1644]: Removed session 10. 
Jan 23 17:29:05.905053 containerd[1675]: time="2026-01-23T17:29:05.904966050Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:29:05.906585 containerd[1675]: time="2026-01-23T17:29:05.906549978Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 23 17:29:05.906636 containerd[1675]: time="2026-01-23T17:29:05.906587418Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 23 17:29:05.906911 kubelet[2900]: E0123 17:29:05.906852 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 17:29:05.906911 kubelet[2900]: E0123 17:29:05.906904 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 23 17:29:05.907223 kubelet[2900]: E0123 17:29:05.906974 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-p68bw_calico-system(baa2fc5a-f231-4d23-992e-37e27c865a7c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 23 17:29:05.907944 containerd[1675]: time="2026-01-23T17:29:05.907917904Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 23 17:29:06.246288 
containerd[1675]: time="2026-01-23T17:29:06.246197404Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:29:06.248117 containerd[1675]: time="2026-01-23T17:29:06.248043413Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 23 17:29:06.248204 containerd[1675]: time="2026-01-23T17:29:06.248137253Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 23 17:29:06.248375 kubelet[2900]: E0123 17:29:06.248339 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 17:29:06.248430 kubelet[2900]: E0123 17:29:06.248387 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 23 17:29:06.248476 kubelet[2900]: E0123 17:29:06.248458 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-p68bw_calico-system(baa2fc5a-f231-4d23-992e-37e27c865a7c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" 
logger="UnhandledError" Jan 23 17:29:06.248525 kubelet[2900]: E0123 17:29:06.248499 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:29:08.562741 kubelet[2900]: E0123 17:29:08.562321 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034" Jan 23 17:29:09.561935 containerd[1675]: time="2026-01-23T17:29:09.561880429Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 23 17:29:09.944399 containerd[1675]: time="2026-01-23T17:29:09.944177144Z" level=info msg="fetch failed after 
status: 404 Not Found" host=ghcr.io Jan 23 17:29:09.946116 containerd[1675]: time="2026-01-23T17:29:09.945981473Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 23 17:29:09.946116 containerd[1675]: time="2026-01-23T17:29:09.946068674Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 23 17:29:09.946406 kubelet[2900]: E0123 17:29:09.946361 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:29:09.946713 kubelet[2900]: E0123 17:29:09.946414 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 23 17:29:09.946713 kubelet[2900]: E0123 17:29:09.946488 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-dd7cc585d-hrzqq_calico-apiserver(bdc6b38c-9de0-4613-a814-03b925209707): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 23 17:29:09.946713 kubelet[2900]: E0123 17:29:09.946516 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound 
desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707" Jan 23 17:29:10.928000 systemd[1]: Started sshd@9-10.0.1.37:22-4.153.228.146:52578.service - OpenSSH per-connection server daemon (4.153.228.146:52578). Jan 23 17:29:10.926000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.1.37:22-4.153.228.146:52578 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:10.931346 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 17:29:10.931485 kernel: audit: type=1130 audit(1769189350.926:757): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.1.37:22-4.153.228.146:52578 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:29:11.441000 audit[5343]: USER_ACCT pid=5343 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:11.443189 sshd[5343]: Accepted publickey for core from 4.153.228.146 port 52578 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:29:11.444000 audit[5343]: CRED_ACQ pid=5343 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:11.446815 sshd-session[5343]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:29:11.448692 kernel: audit: type=1101 audit(1769189351.441:758): pid=5343 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:11.448761 kernel: audit: type=1103 audit(1769189351.444:759): pid=5343 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:11.450381 kernel: audit: type=1006 audit(1769189351.444:760): pid=5343 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 23 17:29:11.444000 audit[5343]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe2db4d80 a2=3 a3=0 items=0 ppid=1 pid=5343 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:11.453422 kernel: audit: type=1300 audit(1769189351.444:760): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe2db4d80 a2=3 a3=0 items=0 ppid=1 pid=5343 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:11.453476 kernel: audit: type=1327 audit(1769189351.444:760): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:11.444000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:11.456186 systemd-logind[1644]: New session 11 of user core. Jan 23 17:29:11.465187 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 23 17:29:11.466000 audit[5343]: USER_START pid=5343 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:11.472607 kernel: audit: type=1105 audit(1769189351.466:761): pid=5343 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:11.472691 kernel: audit: type=1103 audit(1769189351.470:762): pid=5347 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:11.470000 audit[5347]: CRED_ACQ 
pid=5347 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:11.802019 sshd[5347]: Connection closed by 4.153.228.146 port 52578 Jan 23 17:29:11.802486 sshd-session[5343]: pam_unix(sshd:session): session closed for user core Jan 23 17:29:11.802000 audit[5343]: USER_END pid=5343 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:11.806817 systemd-logind[1644]: Session 11 logged out. Waiting for processes to exit. Jan 23 17:29:11.807565 systemd[1]: sshd@9-10.0.1.37:22-4.153.228.146:52578.service: Deactivated successfully. Jan 23 17:29:11.803000 audit[5343]: CRED_DISP pid=5343 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:11.810024 systemd[1]: session-11.scope: Deactivated successfully. 
Jan 23 17:29:11.811061 kernel: audit: type=1106 audit(1769189351.802:763): pid=5343 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:11.811127 kernel: audit: type=1104 audit(1769189351.803:764): pid=5343 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:11.803000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.1.37:22-4.153.228.146:52578 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:11.812498 systemd-logind[1644]: Removed session 11. Jan 23 17:29:11.915437 systemd[1]: Started sshd@10-10.0.1.37:22-4.153.228.146:52586.service - OpenSSH per-connection server daemon (4.153.228.146:52586). Jan 23 17:29:11.914000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.1.37:22-4.153.228.146:52586 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:29:12.460000 audit[5362]: USER_ACCT pid=5362 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:12.462536 sshd[5362]: Accepted publickey for core from 4.153.228.146 port 52586 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:29:12.462000 audit[5362]: CRED_ACQ pid=5362 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:12.462000 audit[5362]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd00bee30 a2=3 a3=0 items=0 ppid=1 pid=5362 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:12.462000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:12.464847 sshd-session[5362]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:29:12.468917 systemd-logind[1644]: New session 12 of user core. Jan 23 17:29:12.477538 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jan 23 17:29:12.478000 audit[5362]: USER_START pid=5362 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:12.481000 audit[5366]: CRED_ACQ pid=5366 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:12.562181 kubelet[2900]: E0123 17:29:12.562130 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622" Jan 23 17:29:12.887522 sshd[5366]: Connection closed by 4.153.228.146 port 52586 Jan 23 17:29:12.888172 sshd-session[5362]: pam_unix(sshd:session): session closed for user core Jan 23 17:29:12.891000 audit[5362]: USER_END pid=5362 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:12.891000 audit[5362]: CRED_DISP pid=5362 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:12.898348 systemd-logind[1644]: Session 12 logged out. Waiting for processes to exit. Jan 23 17:29:12.900000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.1.37:22-4.153.228.146:52586 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:12.901645 systemd[1]: sshd@10-10.0.1.37:22-4.153.228.146:52586.service: Deactivated successfully. Jan 23 17:29:12.904133 systemd[1]: session-12.scope: Deactivated successfully. Jan 23 17:29:12.913204 systemd-logind[1644]: Removed session 12. Jan 23 17:29:12.999165 systemd[1]: Started sshd@11-10.0.1.37:22-4.153.228.146:52602.service - OpenSSH per-connection server daemon (4.153.228.146:52602). Jan 23 17:29:12.998000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.1.37:22-4.153.228.146:52602 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:29:13.538000 audit[5378]: USER_ACCT pid=5378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:13.539531 sshd[5378]: Accepted publickey for core from 4.153.228.146 port 52602 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:29:13.538000 audit[5378]: CRED_ACQ pid=5378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:13.539000 audit[5378]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc8ee5980 a2=3 a3=0 items=0 ppid=1 pid=5378 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:13.539000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:13.541051 sshd-session[5378]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:29:13.544983 systemd-logind[1644]: New session 13 of user core. Jan 23 17:29:13.554646 systemd[1]: Started session-13.scope - Session 13 of User core. 
Jan 23 17:29:13.556000 audit[5378]: USER_START pid=5378 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:13.557000 audit[5382]: CRED_ACQ pid=5382 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:13.898715 sshd[5382]: Connection closed by 4.153.228.146 port 52602 Jan 23 17:29:13.899245 sshd-session[5378]: pam_unix(sshd:session): session closed for user core Jan 23 17:29:13.899000 audit[5378]: USER_END pid=5378 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:13.899000 audit[5378]: CRED_DISP pid=5378 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:13.903664 systemd[1]: sshd@11-10.0.1.37:22-4.153.228.146:52602.service: Deactivated successfully. Jan 23 17:29:13.902000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.1.37:22-4.153.228.146:52602 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:13.905475 systemd[1]: session-13.scope: Deactivated successfully. 
Jan 23 17:29:13.906215 systemd-logind[1644]: Session 13 logged out. Waiting for processes to exit. Jan 23 17:29:13.907359 systemd-logind[1644]: Removed session 13. Jan 23 17:29:17.561860 kubelet[2900]: E0123 17:29:17.561591 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:29:17.561860 kubelet[2900]: E0123 17:29:17.561786 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343" Jan 23 17:29:17.562827 kubelet[2900]: E0123 17:29:17.562583 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc 
error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:29:19.012206 systemd[1]: Started sshd@12-10.0.1.37:22-4.153.228.146:56970.service - OpenSSH per-connection server daemon (4.153.228.146:56970). Jan 23 17:29:19.010000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.1.37:22-4.153.228.146:56970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:19.015355 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 23 17:29:19.015451 kernel: audit: type=1130 audit(1769189359.010:784): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.1.37:22-4.153.228.146:56970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:29:19.535000 audit[5399]: USER_ACCT pid=5399 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:19.537019 sshd[5399]: Accepted publickey for core from 4.153.228.146 port 56970 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:29:19.539562 sshd-session[5399]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:29:19.537000 audit[5399]: CRED_ACQ pid=5399 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:19.542349 kernel: audit: type=1101 audit(1769189359.535:785): pid=5399 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:19.542475 kernel: audit: type=1103 audit(1769189359.537:786): pid=5399 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:19.544146 kernel: audit: type=1006 audit(1769189359.537:787): pid=5399 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Jan 23 17:29:19.537000 audit[5399]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff532dd30 a2=3 a3=0 items=0 ppid=1 pid=5399 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:19.545194 systemd-logind[1644]: New session 14 of user core. Jan 23 17:29:19.547226 kernel: audit: type=1300 audit(1769189359.537:787): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff532dd30 a2=3 a3=0 items=0 ppid=1 pid=5399 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:19.547280 kernel: audit: type=1327 audit(1769189359.537:787): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:19.537000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:19.555560 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 23 17:29:19.557000 audit[5399]: USER_START pid=5399 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:19.559000 audit[5405]: CRED_ACQ pid=5405 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:19.564654 kernel: audit: type=1105 audit(1769189359.557:788): pid=5399 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:19.564953 kernel: audit: type=1103 audit(1769189359.559:789): 
pid=5405 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:19.922654 sshd[5405]: Connection closed by 4.153.228.146 port 56970 Jan 23 17:29:19.923201 sshd-session[5399]: pam_unix(sshd:session): session closed for user core Jan 23 17:29:19.923000 audit[5399]: USER_END pid=5399 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:19.927367 systemd[1]: sshd@12-10.0.1.37:22-4.153.228.146:56970.service: Deactivated successfully. Jan 23 17:29:19.923000 audit[5399]: CRED_DISP pid=5399 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:19.929104 systemd[1]: session-14.scope: Deactivated successfully. Jan 23 17:29:19.930411 systemd-logind[1644]: Session 14 logged out. Waiting for processes to exit. 
Jan 23 17:29:19.930644 kernel: audit: type=1106 audit(1769189359.923:790): pid=5399 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:19.930881 kernel: audit: type=1104 audit(1769189359.923:791): pid=5399 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:19.926000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.1.37:22-4.153.228.146:56970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:19.931492 systemd-logind[1644]: Removed session 14. 
Jan 23 17:29:23.561976 kubelet[2900]: E0123 17:29:23.561926 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034" Jan 23 17:29:24.563086 kubelet[2900]: E0123 17:29:24.563032 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707" Jan 23 17:29:25.033892 systemd[1]: Started sshd@13-10.0.1.37:22-4.153.228.146:58516.service - OpenSSH per-connection server daemon (4.153.228.146:58516). Jan 23 17:29:25.032000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.1.37:22-4.153.228.146:58516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:29:25.037265 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 17:29:25.037362 kernel: audit: type=1130 audit(1769189365.032:793): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.1.37:22-4.153.228.146:58516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:25.558000 audit[5418]: USER_ACCT pid=5418 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:25.560321 sshd[5418]: Accepted publickey for core from 4.153.228.146 port 58516 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:29:25.563341 kernel: audit: type=1101 audit(1769189365.558:794): pid=5418 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:25.563419 kernel: audit: type=1103 audit(1769189365.561:795): pid=5418 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:25.561000 audit[5418]: CRED_ACQ pid=5418 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:25.564101 sshd-session[5418]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:29:25.567825 kernel: audit: type=1006 
audit(1769189365.562:796): pid=5418 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Jan 23 17:29:25.562000 audit[5418]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe04888f0 a2=3 a3=0 items=0 ppid=1 pid=5418 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:25.570941 kernel: audit: type=1300 audit(1769189365.562:796): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe04888f0 a2=3 a3=0 items=0 ppid=1 pid=5418 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:25.562000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:25.572271 kernel: audit: type=1327 audit(1769189365.562:796): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:25.573702 systemd-logind[1644]: New session 15 of user core. Jan 23 17:29:25.581496 systemd[1]: Started session-15.scope - Session 15 of User core. 
Jan 23 17:29:25.582000 audit[5418]: USER_START pid=5418 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:25.586000 audit[5422]: CRED_ACQ pid=5422 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:25.590807 kernel: audit: type=1105 audit(1769189365.582:797): pid=5418 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:25.590923 kernel: audit: type=1103 audit(1769189365.586:798): pid=5422 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:25.925894 sshd[5422]: Connection closed by 4.153.228.146 port 58516 Jan 23 17:29:25.925804 sshd-session[5418]: pam_unix(sshd:session): session closed for user core Jan 23 17:29:25.926000 audit[5418]: USER_END pid=5418 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:25.931491 systemd[1]: 
sshd@13-10.0.1.37:22-4.153.228.146:58516.service: Deactivated successfully. Jan 23 17:29:25.926000 audit[5418]: CRED_DISP pid=5418 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:25.935378 kernel: audit: type=1106 audit(1769189365.926:799): pid=5418 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:25.935452 kernel: audit: type=1104 audit(1769189365.926:800): pid=5418 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:25.934223 systemd[1]: session-15.scope: Deactivated successfully. Jan 23 17:29:25.930000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.1.37:22-4.153.228.146:58516 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:25.936610 systemd-logind[1644]: Session 15 logged out. Waiting for processes to exit. Jan 23 17:29:25.938119 systemd-logind[1644]: Removed session 15. 
Jan 23 17:29:27.561654 kubelet[2900]: E0123 17:29:27.561598 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622" Jan 23 17:29:28.568975 kubelet[2900]: E0123 17:29:28.566981 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:29:31.029695 systemd[1]: Started sshd@14-10.0.1.37:22-4.153.228.146:58520.service - OpenSSH per-connection server daemon (4.153.228.146:58520). Jan 23 17:29:31.028000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.1.37:22-4.153.228.146:58520 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:29:31.032318 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 17:29:31.032396 kernel: audit: type=1130 audit(1769189371.028:802): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.1.37:22-4.153.228.146:58520 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:31.549000 audit[5460]: USER_ACCT pid=5460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:31.551153 sshd[5460]: Accepted publickey for core from 4.153.228.146 port 58520 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:29:31.553674 sshd-session[5460]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:29:31.551000 audit[5460]: CRED_ACQ pid=5460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:31.557440 kernel: audit: type=1101 audit(1769189371.549:803): pid=5460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:31.557505 kernel: audit: type=1103 audit(1769189371.551:804): pid=5460 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:31.557533 kernel: audit: type=1006 
audit(1769189371.551:805): pid=5460 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 23 17:29:31.551000 audit[5460]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd77a8fa0 a2=3 a3=0 items=0 ppid=1 pid=5460 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:31.562800 kernel: audit: type=1300 audit(1769189371.551:805): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd77a8fa0 a2=3 a3=0 items=0 ppid=1 pid=5460 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:31.563992 kubelet[2900]: E0123 17:29:31.563897 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343" Jan 23 17:29:31.565390 kernel: audit: type=1327 audit(1769189371.551:805): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:31.551000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:31.566562 systemd-logind[1644]: New session 16 of user core. Jan 23 17:29:31.575727 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jan 23 17:29:31.577000 audit[5460]: USER_START pid=5460 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:31.578000 audit[5464]: CRED_ACQ pid=5464 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:31.585030 kernel: audit: type=1105 audit(1769189371.577:806): pid=5460 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:31.585083 kernel: audit: type=1103 audit(1769189371.578:807): pid=5464 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:31.914330 sshd[5464]: Connection closed by 4.153.228.146 port 58520 Jan 23 17:29:31.914538 sshd-session[5460]: pam_unix(sshd:session): session closed for user core Jan 23 17:29:31.915000 audit[5460]: USER_END pid=5460 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:31.919485 systemd[1]: 
sshd@14-10.0.1.37:22-4.153.228.146:58520.service: Deactivated successfully. Jan 23 17:29:31.915000 audit[5460]: CRED_DISP pid=5460 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:31.921319 kernel: audit: type=1106 audit(1769189371.915:808): pid=5460 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:31.921587 systemd[1]: session-16.scope: Deactivated successfully. Jan 23 17:29:31.917000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.1.37:22-4.153.228.146:58520 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:31.925373 kernel: audit: type=1104 audit(1769189371.915:809): pid=5460 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:31.924937 systemd-logind[1644]: Session 16 logged out. Waiting for processes to exit. Jan 23 17:29:31.926232 systemd-logind[1644]: Removed session 16. Jan 23 17:29:32.021000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.1.37:22-4.153.228.146:58532 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:32.022287 systemd[1]: Started sshd@15-10.0.1.37:22-4.153.228.146:58532.service - OpenSSH per-connection server daemon (4.153.228.146:58532). 
Jan 23 17:29:32.086909 systemd[1]: Started sshd@16-10.0.1.37:22-103.203.57.2:33574.service - OpenSSH per-connection server daemon (103.203.57.2:33574). Jan 23 17:29:32.085000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.1.37:22-103.203.57.2:33574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:32.557000 audit[5477]: USER_ACCT pid=5477 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:32.558615 sshd[5477]: Accepted publickey for core from 4.153.228.146 port 58532 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:29:32.559000 audit[5477]: CRED_ACQ pid=5477 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:32.559000 audit[5477]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff2f60b30 a2=3 a3=0 items=0 ppid=1 pid=5477 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:32.559000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:32.561127 sshd-session[5477]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:29:32.562337 kubelet[2900]: E0123 17:29:32.562283 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:29:32.565701 systemd-logind[1644]: New session 17 of user core. Jan 23 17:29:32.570612 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 23 17:29:32.572000 audit[5477]: USER_START pid=5477 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:32.574000 audit[5485]: CRED_ACQ pid=5485 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:32.964000 sshd[5485]: Connection closed by 4.153.228.146 port 58532 Jan 23 17:29:32.964696 sshd-session[5477]: pam_unix(sshd:session): session closed for user core Jan 23 17:29:32.966000 audit[5477]: USER_END pid=5477 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:32.966000 audit[5477]: CRED_DISP pid=5477 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:32.971425 systemd-logind[1644]: 
Session 17 logged out. Waiting for processes to exit. Jan 23 17:29:32.972474 systemd[1]: sshd@15-10.0.1.37:22-4.153.228.146:58532.service: Deactivated successfully. Jan 23 17:29:32.971000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.1.37:22-4.153.228.146:58532 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:32.975182 systemd[1]: session-17.scope: Deactivated successfully. Jan 23 17:29:32.977363 systemd-logind[1644]: Removed session 17. Jan 23 17:29:33.070000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.1.37:22-4.153.228.146:58536 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:33.072064 systemd[1]: Started sshd@17-10.0.1.37:22-4.153.228.146:58536.service - OpenSSH per-connection server daemon (4.153.228.146:58536). Jan 23 17:29:33.598000 audit[5496]: USER_ACCT pid=5496 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:33.599857 sshd[5496]: Accepted publickey for core from 4.153.228.146 port 58536 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:29:33.601000 audit[5496]: CRED_ACQ pid=5496 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:33.602000 audit[5496]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc5c1fbf0 a2=3 a3=0 items=0 ppid=1 pid=5496 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" 
exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:33.602000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:33.604564 sshd-session[5496]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:29:33.611635 systemd-logind[1644]: New session 18 of user core. Jan 23 17:29:33.622510 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 23 17:29:33.624000 audit[5496]: USER_START pid=5496 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:33.627000 audit[5500]: CRED_ACQ pid=5500 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:34.442000 audit[5512]: NETFILTER_CFG table=filter:140 family=2 entries=26 op=nft_register_rule pid=5512 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:29:34.442000 audit[5512]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=fffff29daac0 a2=0 a3=1 items=0 ppid=3013 pid=5512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:34.442000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:29:34.457000 audit[5512]: NETFILTER_CFG table=nat:141 family=2 entries=20 op=nft_register_rule pid=5512 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:29:34.457000 audit[5512]: SYSCALL 
arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=fffff29daac0 a2=0 a3=1 items=0 ppid=3013 pid=5512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:34.457000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:29:34.506765 sshd[5500]: Connection closed by 4.153.228.146 port 58536 Jan 23 17:29:34.507133 sshd-session[5496]: pam_unix(sshd:session): session closed for user core Jan 23 17:29:34.507000 audit[5496]: USER_END pid=5496 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:34.507000 audit[5496]: CRED_DISP pid=5496 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:34.511502 systemd[1]: sshd@17-10.0.1.37:22-4.153.228.146:58536.service: Deactivated successfully. Jan 23 17:29:34.510000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.1.37:22-4.153.228.146:58536 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:34.514155 systemd[1]: session-18.scope: Deactivated successfully. Jan 23 17:29:34.515019 systemd-logind[1644]: Session 18 logged out. Waiting for processes to exit. Jan 23 17:29:34.516083 systemd-logind[1644]: Removed session 18. 
Jan 23 17:29:34.563702 kubelet[2900]: E0123 17:29:34.562916 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034" Jan 23 17:29:34.620623 systemd[1]: Started sshd@18-10.0.1.37:22-4.153.228.146:59468.service - OpenSSH per-connection server daemon (4.153.228.146:59468). Jan 23 17:29:34.619000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.1.37:22-4.153.228.146:59468 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 23 17:29:35.151000 audit[5517]: USER_ACCT pid=5517 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:35.152580 sshd[5517]: Accepted publickey for core from 4.153.228.146 port 59468 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:29:35.152000 audit[5517]: CRED_ACQ pid=5517 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:35.152000 audit[5517]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff77173e0 a2=3 a3=0 items=0 ppid=1 pid=5517 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:35.152000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:35.154182 sshd-session[5517]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:29:35.158253 systemd-logind[1644]: New session 19 of user core. Jan 23 17:29:35.163814 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 23 17:29:35.165000 audit[5517]: USER_START pid=5517 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:35.166000 audit[5521]: CRED_ACQ pid=5521 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:35.481000 audit[5529]: NETFILTER_CFG table=filter:142 family=2 entries=38 op=nft_register_rule pid=5529 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:29:35.481000 audit[5529]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=fffff7e2aab0 a2=0 a3=1 items=0 ppid=3013 pid=5529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:35.481000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:29:35.488000 audit[5529]: NETFILTER_CFG table=nat:143 family=2 entries=20 op=nft_register_rule pid=5529 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 23 17:29:35.488000 audit[5529]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=fffff7e2aab0 a2=0 a3=1 items=0 ppid=3013 pid=5529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:35.488000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 23 17:29:35.618442 
sshd[5521]: Connection closed by 4.153.228.146 port 59468 Jan 23 17:29:35.618718 sshd-session[5517]: pam_unix(sshd:session): session closed for user core Jan 23 17:29:35.619000 audit[5517]: USER_END pid=5517 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:35.619000 audit[5517]: CRED_DISP pid=5517 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:35.625635 systemd[1]: sshd@18-10.0.1.37:22-4.153.228.146:59468.service: Deactivated successfully. Jan 23 17:29:35.624000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.1.37:22-4.153.228.146:59468 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:35.627550 systemd[1]: session-19.scope: Deactivated successfully. Jan 23 17:29:35.628427 systemd-logind[1644]: Session 19 logged out. Waiting for processes to exit. Jan 23 17:29:35.629707 systemd-logind[1644]: Removed session 19. Jan 23 17:29:35.725000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.1.37:22-4.153.228.146:59472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:35.726605 systemd[1]: Started sshd@19-10.0.1.37:22-4.153.228.146:59472.service - OpenSSH per-connection server daemon (4.153.228.146:59472). 
Jan 23 17:29:36.261000 audit[5534]: USER_ACCT pid=5534 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:36.263289 sshd[5534]: Accepted publickey for core from 4.153.228.146 port 59472 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw Jan 23 17:29:36.263666 kernel: kauditd_printk_skb: 48 callbacks suppressed Jan 23 17:29:36.263701 kernel: audit: type=1101 audit(1769189376.261:844): pid=5534 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:36.264000 audit[5534]: CRED_ACQ pid=5534 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:36.266588 sshd-session[5534]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 23 17:29:36.269127 kernel: audit: type=1103 audit(1769189376.264:845): pid=5534 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:36.270920 kernel: audit: type=1006 audit(1769189376.264:846): pid=5534 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Jan 23 17:29:36.270981 kernel: audit: type=1300 audit(1769189376.264:846): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd8781740 a2=3 a3=0 items=0 ppid=1 pid=5534 
auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:36.264000 audit[5534]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd8781740 a2=3 a3=0 items=0 ppid=1 pid=5534 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:29:36.264000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:36.274988 kernel: audit: type=1327 audit(1769189376.264:846): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 23 17:29:36.275216 systemd-logind[1644]: New session 20 of user core. Jan 23 17:29:36.284672 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 23 17:29:36.286000 audit[5534]: USER_START pid=5534 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:36.290000 audit[5538]: CRED_ACQ pid=5538 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:36.294422 kernel: audit: type=1105 audit(1769189376.286:847): pid=5534 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:36.294526 
kernel: audit: type=1103 audit(1769189376.290:848): pid=5538 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:36.618498 sshd[5538]: Connection closed by 4.153.228.146 port 59472 Jan 23 17:29:36.619430 sshd-session[5534]: pam_unix(sshd:session): session closed for user core Jan 23 17:29:36.620000 audit[5534]: USER_END pid=5534 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:36.624821 systemd[1]: sshd@19-10.0.1.37:22-4.153.228.146:59472.service: Deactivated successfully. Jan 23 17:29:36.620000 audit[5534]: CRED_DISP pid=5534 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:36.627152 systemd[1]: session-20.scope: Deactivated successfully. 
Jan 23 17:29:36.627931 kernel: audit: type=1106 audit(1769189376.620:849): pid=5534 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:36.627999 kernel: audit: type=1104 audit(1769189376.620:850): pid=5534 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 23 17:29:36.628019 kernel: audit: type=1131 audit(1769189376.623:851): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.1.37:22-4.153.228.146:59472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:36.623000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.1.37:22-4.153.228.146:59472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 23 17:29:36.629720 systemd-logind[1644]: Session 20 logged out. Waiting for processes to exit. Jan 23 17:29:36.630987 systemd-logind[1644]: Removed session 20. 
Jan 23 17:29:37.561417 kubelet[2900]: E0123 17:29:37.561273 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707"
Jan 23 17:29:38.457000 audit[5551]: NETFILTER_CFG table=filter:144 family=2 entries=26 op=nft_register_rule pid=5551 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Jan 23 17:29:38.457000 audit[5551]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff40f2230 a2=0 a3=1 items=0 ppid=3013 pid=5551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:29:38.457000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Jan 23 17:29:38.469000 audit[5551]: NETFILTER_CFG table=nat:145 family=2 entries=104 op=nft_register_chain pid=5551 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Jan 23 17:29:38.469000 audit[5551]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=fffff40f2230 a2=0 a3=1 items=0 ppid=3013 pid=5551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:29:38.469000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Jan 23 17:29:39.563827 kubelet[2900]: E0123 17:29:39.562614 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622"
Jan 23 17:29:41.730000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.1.37:22-4.153.228.146:59486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:29:41.732043 systemd[1]: Started sshd@20-10.0.1.37:22-4.153.228.146:59486.service - OpenSSH per-connection server daemon (4.153.228.146:59486).
Jan 23 17:29:41.733116 kernel: kauditd_printk_skb: 6 callbacks suppressed
Jan 23 17:29:41.733163 kernel: audit: type=1130 audit(1769189381.730:854): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.1.37:22-4.153.228.146:59486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:29:42.074674 sshd[5481]: Connection closed by 103.203.57.2 port 33574 [preauth]
Jan 23 17:29:42.076891 systemd[1]: sshd@16-10.0.1.37:22-103.203.57.2:33574.service: Deactivated successfully.
Jan 23 17:29:42.076000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.1.37:22-103.203.57.2:33574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:29:42.081376 kernel: audit: type=1131 audit(1769189382.076:855): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.1.37:22-103.203.57.2:33574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:29:42.261771 sshd[5553]: Accepted publickey for core from 4.153.228.146 port 59486 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw
Jan 23 17:29:42.260000 audit[5553]: USER_ACCT pid=5553 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:42.267008 sshd-session[5553]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 17:29:42.263000 audit[5553]: CRED_ACQ pid=5553 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:42.270723 kernel: audit: type=1101 audit(1769189382.260:856): pid=5553 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:42.270794 kernel: audit: type=1103 audit(1769189382.263:857): pid=5553 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:42.270813 kernel: audit: type=1006 audit(1769189382.263:858): pid=5553 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1
Jan 23 17:29:42.272327 kernel: audit: type=1300 audit(1769189382.263:858): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffecf254d0 a2=3 a3=0 items=0 ppid=1 pid=5553 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:29:42.263000 audit[5553]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffecf254d0 a2=3 a3=0 items=0 ppid=1 pid=5553 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:29:42.275141 kernel: audit: type=1327 audit(1769189382.263:858): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 17:29:42.263000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 17:29:42.280133 systemd-logind[1644]: New session 21 of user core.
Jan 23 17:29:42.292500 systemd[1]: Started session-21.scope - Session 21 of User core.
Jan 23 17:29:42.293000 audit[5553]: USER_START pid=5553 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:42.295000 audit[5559]: CRED_ACQ pid=5559 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:42.300948 kernel: audit: type=1105 audit(1769189382.293:859): pid=5553 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:42.301011 kernel: audit: type=1103 audit(1769189382.295:860): pid=5559 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:42.564287 kubelet[2900]: E0123 17:29:42.564075 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c"
Jan 23 17:29:42.615215 sshd[5559]: Connection closed by 4.153.228.146 port 59486
Jan 23 17:29:42.615721 sshd-session[5553]: pam_unix(sshd:session): session closed for user core
Jan 23 17:29:42.616000 audit[5553]: USER_END pid=5553 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:42.620189 systemd[1]: sshd@20-10.0.1.37:22-4.153.228.146:59486.service: Deactivated successfully.
Jan 23 17:29:42.616000 audit[5553]: CRED_DISP pid=5553 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:42.621504 kernel: audit: type=1106 audit(1769189382.616:861): pid=5553 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:42.620000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.1.37:22-4.153.228.146:59486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:29:42.624059 systemd[1]: session-21.scope: Deactivated successfully.
Jan 23 17:29:42.625926 systemd-logind[1644]: Session 21 logged out. Waiting for processes to exit.
Jan 23 17:29:42.626895 systemd-logind[1644]: Removed session 21.
Jan 23 17:29:43.562207 kubelet[2900]: E0123 17:29:43.562161 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06"
Jan 23 17:29:44.561570 kubelet[2900]: E0123 17:29:44.561515 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343"
Jan 23 17:29:46.562162 kubelet[2900]: E0123 17:29:46.562110 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034"
Jan 23 17:29:47.730025 systemd[1]: Started sshd@21-10.0.1.37:22-4.153.228.146:48816.service - OpenSSH per-connection server daemon (4.153.228.146:48816).
Jan 23 17:29:47.729000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.1.37:22-4.153.228.146:48816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:29:47.733462 kernel: kauditd_printk_skb: 2 callbacks suppressed
Jan 23 17:29:47.733530 kernel: audit: type=1130 audit(1769189387.729:864): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.1.37:22-4.153.228.146:48816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:29:48.272000 audit[5574]: USER_ACCT pid=5574 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:48.277551 kernel: audit: type=1101 audit(1769189388.272:865): pid=5574 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:48.277629 sshd[5574]: Accepted publickey for core from 4.153.228.146 port 48816 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw
Jan 23 17:29:48.276000 audit[5574]: CRED_ACQ pid=5574 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:48.278717 sshd-session[5574]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 17:29:48.282229 kernel: audit: type=1103 audit(1769189388.276:866): pid=5574 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:48.282297 kernel: audit: type=1006 audit(1769189388.276:867): pid=5574 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1
Jan 23 17:29:48.276000 audit[5574]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffce5f5ea0 a2=3 a3=0 items=0 ppid=1 pid=5574 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:29:48.284711 systemd-logind[1644]: New session 22 of user core.
Jan 23 17:29:48.285375 kernel: audit: type=1300 audit(1769189388.276:867): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffce5f5ea0 a2=3 a3=0 items=0 ppid=1 pid=5574 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:29:48.285411 kernel: audit: type=1327 audit(1769189388.276:867): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 17:29:48.276000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 17:29:48.290552 systemd[1]: Started session-22.scope - Session 22 of User core.
Jan 23 17:29:48.293000 audit[5574]: USER_START pid=5574 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:48.296000 audit[5578]: CRED_ACQ pid=5578 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:48.300509 kernel: audit: type=1105 audit(1769189388.293:868): pid=5574 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:48.300658 kernel: audit: type=1103 audit(1769189388.296:869): pid=5578 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:48.668512 sshd[5578]: Connection closed by 4.153.228.146 port 48816
Jan 23 17:29:48.668804 sshd-session[5574]: pam_unix(sshd:session): session closed for user core
Jan 23 17:29:48.669000 audit[5574]: USER_END pid=5574 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:48.673978 systemd[1]: sshd@21-10.0.1.37:22-4.153.228.146:48816.service: Deactivated successfully.
Jan 23 17:29:48.670000 audit[5574]: CRED_DISP pid=5574 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:48.676012 systemd[1]: session-22.scope: Deactivated successfully.
Jan 23 17:29:48.679325 kernel: audit: type=1106 audit(1769189388.669:870): pid=5574 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:48.679416 kernel: audit: type=1104 audit(1769189388.670:871): pid=5574 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:48.672000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.1.37:22-4.153.228.146:48816 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:29:48.677716 systemd-logind[1644]: Session 22 logged out. Waiting for processes to exit.
Jan 23 17:29:48.679882 systemd-logind[1644]: Removed session 22.
Jan 23 17:29:51.561236 kubelet[2900]: E0123 17:29:51.561167 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707"
Jan 23 17:29:51.562285 kubelet[2900]: E0123 17:29:51.561634 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622"
Jan 23 17:29:53.779231 systemd[1]: Started sshd@22-10.0.1.37:22-4.153.228.146:48826.service - OpenSSH per-connection server daemon (4.153.228.146:48826).
Jan 23 17:29:53.778000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.1.37:22-4.153.228.146:48826 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:29:53.780467 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 23 17:29:53.780504 kernel: audit: type=1130 audit(1769189393.778:873): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.1.37:22-4.153.228.146:48826 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:29:54.304000 audit[5594]: USER_ACCT pid=5594 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:54.305709 sshd[5594]: Accepted publickey for core from 4.153.228.146 port 48826 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw
Jan 23 17:29:54.307000 audit[5594]: CRED_ACQ pid=5594 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:54.309407 sshd-session[5594]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 17:29:54.311278 kernel: audit: type=1101 audit(1769189394.304:874): pid=5594 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:54.311375 kernel: audit: type=1103 audit(1769189394.307:875): pid=5594 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:54.311393 kernel: audit: type=1006 audit(1769189394.307:876): pid=5594 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1
Jan 23 17:29:54.307000 audit[5594]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3c01dc0 a2=3 a3=0 items=0 ppid=1 pid=5594 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:29:54.316044 kernel: audit: type=1300 audit(1769189394.307:876): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3c01dc0 a2=3 a3=0 items=0 ppid=1 pid=5594 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:29:54.316149 kernel: audit: type=1327 audit(1769189394.307:876): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 17:29:54.307000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 17:29:54.322534 systemd-logind[1644]: New session 23 of user core.
Jan 23 17:29:54.332541 systemd[1]: Started session-23.scope - Session 23 of User core.
Jan 23 17:29:54.333000 audit[5594]: USER_START pid=5594 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:54.337000 audit[5598]: CRED_ACQ pid=5598 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:54.341815 kernel: audit: type=1105 audit(1769189394.333:877): pid=5594 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:54.341895 kernel: audit: type=1103 audit(1769189394.337:878): pid=5598 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:54.696392 sshd[5598]: Connection closed by 4.153.228.146 port 48826
Jan 23 17:29:54.697151 sshd-session[5594]: pam_unix(sshd:session): session closed for user core
Jan 23 17:29:54.698000 audit[5594]: USER_END pid=5594 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:54.703402 systemd[1]: sshd@22-10.0.1.37:22-4.153.228.146:48826.service: Deactivated successfully.
Jan 23 17:29:54.698000 audit[5594]: CRED_DISP pid=5594 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:54.706141 kernel: audit: type=1106 audit(1769189394.698:879): pid=5594 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:54.706209 kernel: audit: type=1104 audit(1769189394.698:880): pid=5594 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:29:54.707713 systemd[1]: session-23.scope: Deactivated successfully.
Jan 23 17:29:54.704000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.1.37:22-4.153.228.146:48826 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:29:54.708932 systemd-logind[1644]: Session 23 logged out. Waiting for processes to exit.
Jan 23 17:29:54.711898 systemd-logind[1644]: Removed session 23.
Jan 23 17:29:55.561159 kubelet[2900]: E0123 17:29:55.561101 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343"
Jan 23 17:29:55.561159 kubelet[2900]: E0123 17:29:55.561153 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06"
Jan 23 17:29:56.566167 kubelet[2900]: E0123 17:29:56.565998 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c"
Jan 23 17:29:59.561717 kubelet[2900]: E0123 17:29:59.561656 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034"
Jan 23 17:29:59.803000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.1.37:22-4.153.228.146:35922 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:29:59.804960 systemd[1]: Started sshd@23-10.0.1.37:22-4.153.228.146:35922.service - OpenSSH per-connection server daemon (4.153.228.146:35922).
Jan 23 17:29:59.805795 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 23 17:29:59.805835 kernel: audit: type=1130 audit(1769189399.803:882): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.1.37:22-4.153.228.146:35922 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:30:00.337000 audit[5638]: USER_ACCT pid=5638 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:30:00.339508 sshd[5638]: Accepted publickey for core from 4.153.228.146 port 35922 ssh2: RSA SHA256:W4H1ZAj5T8TSA1hLSDGRkaV3wBweA5tg1dlTi453QCw
Jan 23 17:30:00.341000 audit[5638]: CRED_ACQ pid=5638 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:30:00.343437 sshd-session[5638]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 23 17:30:00.345342 kernel: audit: type=1101 audit(1769189400.337:883): pid=5638 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:30:00.345407 kernel: audit: type=1103 audit(1769189400.341:884): pid=5638 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:30:00.345428 kernel: audit: type=1006 audit(1769189400.341:885): pid=5638 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1
Jan 23 17:30:00.341000 audit[5638]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd13fc000 a2=3 a3=0 items=0 ppid=1 pid=5638 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:30:00.348745 systemd-logind[1644]: New session 24 of user core.
Jan 23 17:30:00.350050 kernel: audit: type=1300 audit(1769189400.341:885): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd13fc000 a2=3 a3=0 items=0 ppid=1 pid=5638 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:30:00.350098 kernel: audit: type=1327 audit(1769189400.341:885): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 17:30:00.341000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 23 17:30:00.360699 systemd[1]: Started session-24.scope - Session 24 of User core.
Jan 23 17:30:00.362000 audit[5638]: USER_START pid=5638 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:30:00.364000 audit[5642]: CRED_ACQ pid=5642 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:30:00.369939 kernel: audit: type=1105 audit(1769189400.362:886): pid=5638 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:30:00.369996 kernel: audit: type=1103 audit(1769189400.364:887): pid=5642 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:30:00.697339 sshd[5642]: Connection closed by 4.153.228.146 port 35922
Jan 23 17:30:00.697545 sshd-session[5638]: pam_unix(sshd:session): session closed for user core
Jan 23 17:30:00.697000 audit[5638]: USER_END pid=5638 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:30:00.702864 systemd[1]: sshd@23-10.0.1.37:22-4.153.228.146:35922.service: Deactivated successfully.
Jan 23 17:30:00.697000 audit[5638]: CRED_DISP pid=5638 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:30:00.703354 systemd-logind[1644]: Session 24 logged out. Waiting for processes to exit.
Jan 23 17:30:00.704782 systemd[1]: session-24.scope: Deactivated successfully.
Jan 23 17:30:00.705651 kernel: audit: type=1106 audit(1769189400.697:888): pid=5638 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:30:00.705714 kernel: audit: type=1104 audit(1769189400.697:889): pid=5638 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 23 17:30:00.701000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.1.37:22-4.153.228.146:35922 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 23 17:30:00.706451 systemd-logind[1644]: Removed session 24.
Jan 23 17:30:04.564295 kubelet[2900]: E0123 17:30:04.564220 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622"
Jan 23 17:30:06.566384 kubelet[2900]: E0123 17:30:06.565759 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707"
Jan 23 17:30:09.562321 kubelet[2900]: E0123 17:30:09.561821 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06"
Jan 23 17:30:10.562089 kubelet[2900]: E0123 17:30:10.561660 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343" Jan 23 17:30:11.562199 kubelet[2900]: E0123 17:30:11.562128 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c" Jan 23 17:30:13.562133 kubelet[2900]: E0123 17:30:13.561960 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code 
= NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034" Jan 23 17:30:15.561684 kubelet[2900]: E0123 17:30:15.561622 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622" Jan 23 17:30:21.561531 kubelet[2900]: E0123 17:30:21.561442 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707" Jan 23 17:30:23.209152 systemd[1]: cri-containerd-64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06.scope: Deactivated successfully. Jan 23 17:30:23.210754 systemd[1]: cri-containerd-64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06.scope: Consumed 36.403s CPU time, 114.9M memory peak. 
Jan 23 17:30:23.212214 containerd[1675]: time="2026-01-23T17:30:23.212097003Z" level=info msg="received container exit event container_id:\"64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06\" id:\"64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06\" pid:3229 exit_status:1 exited_at:{seconds:1769189423 nanos:211696081}" Jan 23 17:30:23.216290 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 23 17:30:23.216495 kernel: audit: type=1334 audit(1769189423.214:891): prog-id=146 op=UNLOAD Jan 23 17:30:23.216524 kernel: audit: type=1334 audit(1769189423.214:892): prog-id=150 op=UNLOAD Jan 23 17:30:23.214000 audit: BPF prog-id=146 op=UNLOAD Jan 23 17:30:23.214000 audit: BPF prog-id=150 op=UNLOAD Jan 23 17:30:23.214988 systemd[1]: cri-containerd-5c2e7756de7eab65f7b700c80d989c570101c0ff9cda4fb1c227ce4ccffa573f.scope: Deactivated successfully. Jan 23 17:30:23.215340 systemd[1]: cri-containerd-5c2e7756de7eab65f7b700c80d989c570101c0ff9cda4fb1c227ce4ccffa573f.scope: Consumed 4.383s CPU time, 64.9M memory peak. 
Jan 23 17:30:23.215000 audit: BPF prog-id=256 op=LOAD Jan 23 17:30:23.217399 containerd[1675]: time="2026-01-23T17:30:23.217324069Z" level=info msg="received container exit event container_id:\"5c2e7756de7eab65f7b700c80d989c570101c0ff9cda4fb1c227ce4ccffa573f\" id:\"5c2e7756de7eab65f7b700c80d989c570101c0ff9cda4fb1c227ce4ccffa573f\" pid:2751 exit_status:1 exited_at:{seconds:1769189423 nanos:217040907}" Jan 23 17:30:23.217816 kernel: audit: type=1334 audit(1769189423.215:893): prog-id=256 op=LOAD Jan 23 17:30:23.217892 kernel: audit: type=1334 audit(1769189423.215:894): prog-id=83 op=UNLOAD Jan 23 17:30:23.215000 audit: BPF prog-id=83 op=UNLOAD Jan 23 17:30:23.220000 audit: BPF prog-id=108 op=UNLOAD Jan 23 17:30:23.220000 audit: BPF prog-id=112 op=UNLOAD Jan 23 17:30:23.222432 kernel: audit: type=1334 audit(1769189423.220:895): prog-id=108 op=UNLOAD Jan 23 17:30:23.222499 kernel: audit: type=1334 audit(1769189423.220:896): prog-id=112 op=UNLOAD Jan 23 17:30:23.234941 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06-rootfs.mount: Deactivated successfully. Jan 23 17:30:23.242678 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5c2e7756de7eab65f7b700c80d989c570101c0ff9cda4fb1c227ce4ccffa573f-rootfs.mount: Deactivated successfully. 
Jan 23 17:30:23.463963 kubelet[2900]: E0123 17:30:23.463826 2900 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.1.37:47976->10.0.1.42:2379: read: connection timed out" Jan 23 17:30:23.561295 containerd[1675]: time="2026-01-23T17:30:23.561259596Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 23 17:30:23.905420 containerd[1675]: time="2026-01-23T17:30:23.905199123Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:30:23.906747 containerd[1675]: time="2026-01-23T17:30:23.906676730Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 23 17:30:23.906824 containerd[1675]: time="2026-01-23T17:30:23.906726051Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 23 17:30:23.906978 kubelet[2900]: E0123 17:30:23.906940 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 17:30:23.907026 kubelet[2900]: E0123 17:30:23.906989 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 23 17:30:23.907085 kubelet[2900]: E0123 17:30:23.907067 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod 
goldmane-7c778bb748-ssq4v_calico-system(4e4a5b57-4234-49d1-b775-45759cc5cd06): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 23 17:30:23.907117 kubelet[2900]: E0123 17:30:23.907101 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06" Jan 23 17:30:24.129721 kubelet[2900]: I0123 17:30:24.129260 2900 scope.go:117] "RemoveContainer" containerID="5c2e7756de7eab65f7b700c80d989c570101c0ff9cda4fb1c227ce4ccffa573f" Jan 23 17:30:24.131633 containerd[1675]: time="2026-01-23T17:30:24.131397473Z" level=info msg="CreateContainer within sandbox \"5cf44d65875e6a2134c83dcee7272447686dec73bdbbb593ee8361d63cc1f971\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 23 17:30:24.132387 kubelet[2900]: I0123 17:30:24.132365 2900 scope.go:117] "RemoveContainer" containerID="64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06" Jan 23 17:30:24.134823 containerd[1675]: time="2026-01-23T17:30:24.134791450Z" level=info msg="CreateContainer within sandbox \"d5c6646efba872b11ae4414bfaf1e6dae26d443d19e42eb200b28671cec6198d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 23 17:30:24.141623 containerd[1675]: time="2026-01-23T17:30:24.141586043Z" level=info msg="Container 6f84ef6654e77f0895e7b8ff836e47458574f0242623494f2669198f58f7a19c: CDI devices from CRI Config.CDIDevices: []" Jan 23 17:30:24.149605 containerd[1675]: time="2026-01-23T17:30:24.149564722Z" level=info msg="Container 
f8178449d120fb7547aaec3675f897f8f31d33319699114984209fed95dbff05: CDI devices from CRI Config.CDIDevices: []" Jan 23 17:30:24.153385 containerd[1675]: time="2026-01-23T17:30:24.153346981Z" level=info msg="CreateContainer within sandbox \"5cf44d65875e6a2134c83dcee7272447686dec73bdbbb593ee8361d63cc1f971\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"6f84ef6654e77f0895e7b8ff836e47458574f0242623494f2669198f58f7a19c\"" Jan 23 17:30:24.153942 containerd[1675]: time="2026-01-23T17:30:24.153903103Z" level=info msg="StartContainer for \"6f84ef6654e77f0895e7b8ff836e47458574f0242623494f2669198f58f7a19c\"" Jan 23 17:30:24.155262 containerd[1675]: time="2026-01-23T17:30:24.155197790Z" level=info msg="connecting to shim 6f84ef6654e77f0895e7b8ff836e47458574f0242623494f2669198f58f7a19c" address="unix:///run/containerd/s/f8d16c9df8d4a7eae414ae67a60d4ab77c3fe6a5ce80e3cef06d3b7d0f8efb33" protocol=ttrpc version=3 Jan 23 17:30:24.157009 containerd[1675]: time="2026-01-23T17:30:24.156897478Z" level=info msg="CreateContainer within sandbox \"d5c6646efba872b11ae4414bfaf1e6dae26d443d19e42eb200b28671cec6198d\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"f8178449d120fb7547aaec3675f897f8f31d33319699114984209fed95dbff05\"" Jan 23 17:30:24.157520 containerd[1675]: time="2026-01-23T17:30:24.157469641Z" level=info msg="StartContainer for \"f8178449d120fb7547aaec3675f897f8f31d33319699114984209fed95dbff05\"" Jan 23 17:30:24.158837 containerd[1675]: time="2026-01-23T17:30:24.158704847Z" level=info msg="connecting to shim f8178449d120fb7547aaec3675f897f8f31d33319699114984209fed95dbff05" address="unix:///run/containerd/s/b64269228654a1785b714ce7ecd2180aa61dd27cf3b37a9263ea557170b2fcf3" protocol=ttrpc version=3 Jan 23 17:30:24.180514 systemd[1]: Started cri-containerd-6f84ef6654e77f0895e7b8ff836e47458574f0242623494f2669198f58f7a19c.scope - libcontainer container 6f84ef6654e77f0895e7b8ff836e47458574f0242623494f2669198f58f7a19c. 
Jan 23 17:30:24.181755 systemd[1]: Started cri-containerd-f8178449d120fb7547aaec3675f897f8f31d33319699114984209fed95dbff05.scope - libcontainer container f8178449d120fb7547aaec3675f897f8f31d33319699114984209fed95dbff05. Jan 23 17:30:24.193000 audit: BPF prog-id=257 op=LOAD Jan 23 17:30:24.195362 kernel: audit: type=1334 audit(1769189424.193:897): prog-id=257 op=LOAD Jan 23 17:30:24.195000 audit: BPF prog-id=258 op=LOAD Jan 23 17:30:24.195000 audit[5692]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400017a180 a2=98 a3=0 items=0 ppid=3083 pid=5692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:30:24.200044 kernel: audit: type=1334 audit(1769189424.195:898): prog-id=258 op=LOAD Jan 23 17:30:24.201435 kernel: audit: type=1300 audit(1769189424.195:898): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400017a180 a2=98 a3=0 items=0 ppid=3083 pid=5692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:30:24.201484 kernel: audit: type=1327 audit(1769189424.195:898): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313738343439643132306662373534376161656333363735663839 Jan 23 17:30:24.195000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313738343439643132306662373534376161656333363735663839 Jan 23 17:30:24.195000 audit: BPF prog-id=258 op=UNLOAD Jan 23 17:30:24.195000 audit[5692]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 
a2=0 a3=0 items=0 ppid=3083 pid=5692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:30:24.195000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313738343439643132306662373534376161656333363735663839 Jan 23 17:30:24.195000 audit: BPF prog-id=259 op=LOAD Jan 23 17:30:24.195000 audit[5692]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400017a3e8 a2=98 a3=0 items=0 ppid=3083 pid=5692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:30:24.195000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313738343439643132306662373534376161656333363735663839 Jan 23 17:30:24.195000 audit: BPF prog-id=260 op=LOAD Jan 23 17:30:24.195000 audit[5692]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=400017a168 a2=98 a3=0 items=0 ppid=3083 pid=5692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:30:24.195000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313738343439643132306662373534376161656333363735663839 Jan 23 17:30:24.195000 audit: BPF prog-id=260 op=UNLOAD Jan 23 17:30:24.195000 audit[5692]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3083 pid=5692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:30:24.195000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313738343439643132306662373534376161656333363735663839 Jan 23 17:30:24.195000 audit: BPF prog-id=259 op=UNLOAD Jan 23 17:30:24.195000 audit[5692]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3083 pid=5692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:30:24.195000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313738343439643132306662373534376161656333363735663839 Jan 23 17:30:24.195000 audit: BPF prog-id=261 op=LOAD Jan 23 17:30:24.195000 audit[5692]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400017a648 a2=98 a3=0 items=0 ppid=3083 pid=5692 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:30:24.195000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6638313738343439643132306662373534376161656333363735663839 Jan 23 17:30:24.196000 audit: BPF prog-id=262 op=LOAD Jan 23 
17:30:24.199000 audit: BPF prog-id=263 op=LOAD Jan 23 17:30:24.199000 audit[5691]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2571 pid=5691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:30:24.199000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666383465663636353465373766303839356537623866663833366534 Jan 23 17:30:24.202000 audit: BPF prog-id=263 op=UNLOAD Jan 23 17:30:24.202000 audit[5691]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2571 pid=5691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:30:24.202000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666383465663636353465373766303839356537623866663833366534 Jan 23 17:30:24.202000 audit: BPF prog-id=264 op=LOAD Jan 23 17:30:24.202000 audit[5691]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2571 pid=5691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:30:24.202000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666383465663636353465373766303839356537623866663833366534 Jan 23 17:30:24.203000 audit: BPF prog-id=265 op=LOAD Jan 23 17:30:24.203000 audit[5691]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2571 pid=5691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:30:24.203000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666383465663636353465373766303839356537623866663833366534 Jan 23 17:30:24.203000 audit: BPF prog-id=265 op=UNLOAD Jan 23 17:30:24.203000 audit[5691]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2571 pid=5691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:30:24.203000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666383465663636353465373766303839356537623866663833366534 Jan 23 17:30:24.203000 audit: BPF prog-id=264 op=UNLOAD Jan 23 17:30:24.203000 audit[5691]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2571 pid=5691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
23 17:30:24.203000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666383465663636353465373766303839356537623866663833366534 Jan 23 17:30:24.203000 audit: BPF prog-id=266 op=LOAD Jan 23 17:30:24.203000 audit[5691]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2571 pid=5691 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 23 17:30:24.203000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666383465663636353465373766303839356537623866663833366534 Jan 23 17:30:24.224250 containerd[1675]: time="2026-01-23T17:30:24.224212128Z" level=info msg="StartContainer for \"f8178449d120fb7547aaec3675f897f8f31d33319699114984209fed95dbff05\" returns successfully" Jan 23 17:30:24.239498 containerd[1675]: time="2026-01-23T17:30:24.239455123Z" level=info msg="StartContainer for \"6f84ef6654e77f0895e7b8ff836e47458574f0242623494f2669198f58f7a19c\" returns successfully" Jan 23 17:30:24.564452 containerd[1675]: time="2026-01-23T17:30:24.564300836Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 23 17:30:24.902292 containerd[1675]: time="2026-01-23T17:30:24.902199414Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 23 17:30:24.903997 containerd[1675]: time="2026-01-23T17:30:24.903943943Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 23 17:30:24.904125 containerd[1675]: time="2026-01-23T17:30:24.904001623Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 23 17:30:24.904215 kubelet[2900]: E0123 17:30:24.904182 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 17:30:24.904688 kubelet[2900]: E0123 17:30:24.904227 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 23 17:30:24.904688 kubelet[2900]: E0123 17:30:24.904390 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5bcfdccd6d-ph9sw_calico-system(d8d63139-4f9a-445f-a5a9-c69b14220343): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 23 17:30:24.904688 kubelet[2900]: E0123 17:30:24.904428 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343"
Jan 23 17:30:24.904775 containerd[1675]: time="2026-01-23T17:30:24.904562706Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Jan 23 17:30:25.261928 containerd[1675]: time="2026-01-23T17:30:25.261680818Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 17:30:25.263455 containerd[1675]: time="2026-01-23T17:30:25.263385186Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Jan 23 17:30:25.263541 containerd[1675]: time="2026-01-23T17:30:25.263487346Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0"
Jan 23 17:30:25.263967 kubelet[2900]: E0123 17:30:25.263702 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Jan 23 17:30:25.263967 kubelet[2900]: E0123 17:30:25.263749 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Jan 23 17:30:25.263967 kubelet[2900]: E0123 17:30:25.263826 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-68b765b4fc-4fz2b_calico-system(348a8202-26ae-4e13-80ce-0b3caa537034): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError"
Jan 23 17:30:25.266563 containerd[1675]: time="2026-01-23T17:30:25.266473401Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\""
Jan 23 17:30:25.596443 containerd[1675]: time="2026-01-23T17:30:25.596186938Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 17:30:25.597828 containerd[1675]: time="2026-01-23T17:30:25.597787906Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found"
Jan 23 17:30:25.597923 containerd[1675]: time="2026-01-23T17:30:25.597845107Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0"
Jan 23 17:30:25.598058 kubelet[2900]: E0123 17:30:25.597999 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Jan 23 17:30:25.598146 kubelet[2900]: E0123 17:30:25.598121 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Jan 23 17:30:25.598285 kubelet[2900]: E0123 17:30:25.598233 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-68b765b4fc-4fz2b_calico-system(348a8202-26ae-4e13-80ce-0b3caa537034): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError"
Jan 23 17:30:25.598350 kubelet[2900]: E0123 17:30:25.598300 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034"
Jan 23 17:30:26.566113 containerd[1675]: time="2026-01-23T17:30:26.566059856Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Jan 23 17:30:26.905027 containerd[1675]: time="2026-01-23T17:30:26.904859718Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 17:30:26.906442 containerd[1675]: time="2026-01-23T17:30:26.906409366Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Jan 23 17:30:26.906523 containerd[1675]: time="2026-01-23T17:30:26.906486006Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0"
Jan 23 17:30:26.906699 kubelet[2900]: E0123 17:30:26.906646 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Jan 23 17:30:26.906699 kubelet[2900]: E0123 17:30:26.906692 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Jan 23 17:30:26.906984 kubelet[2900]: E0123 17:30:26.906909 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-dd7cc585d-gp992_calico-apiserver(bad712f1-634f-4629-aa4d-a4a0636cb622): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Jan 23 17:30:26.906984 kubelet[2900]: E0123 17:30:26.906944 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622"
Jan 23 17:30:26.907178 containerd[1675]: time="2026-01-23T17:30:26.907127729Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\""
Jan 23 17:30:27.234693 containerd[1675]: time="2026-01-23T17:30:27.234495775Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 17:30:27.236011 containerd[1675]: time="2026-01-23T17:30:27.235975542Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found"
Jan 23 17:30:27.236102 containerd[1675]: time="2026-01-23T17:30:27.236009183Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0"
Jan 23 17:30:27.236198 kubelet[2900]: E0123 17:30:27.236160 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Jan 23 17:30:27.236249 kubelet[2900]: E0123 17:30:27.236207 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Jan 23 17:30:27.236299 kubelet[2900]: E0123 17:30:27.236277 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-p68bw_calico-system(baa2fc5a-f231-4d23-992e-37e27c865a7c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError"
Jan 23 17:30:27.236980 containerd[1675]: time="2026-01-23T17:30:27.236955867Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\""
Jan 23 17:30:27.569935 containerd[1675]: time="2026-01-23T17:30:27.569710820Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 17:30:27.571449 containerd[1675]: time="2026-01-23T17:30:27.571367068Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found"
Jan 23 17:30:27.571449 containerd[1675]: time="2026-01-23T17:30:27.571399308Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0"
Jan 23 17:30:27.571748 kubelet[2900]: E0123 17:30:27.571685 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Jan 23 17:30:27.571859 kubelet[2900]: E0123 17:30:27.571843 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Jan 23 17:30:27.571986 kubelet[2900]: E0123 17:30:27.571968 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-p68bw_calico-system(baa2fc5a-f231-4d23-992e-37e27c865a7c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError"
Jan 23 17:30:27.572187 kubelet[2900]: E0123 17:30:27.572161 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c"
Jan 23 17:30:29.142000 audit: BPF prog-id=267 op=LOAD
Jan 23 17:30:29.141886 systemd[1]: cri-containerd-32aed78b76544c076f5b266dacdad0c5e108e6f3baf421a20aa3fd075bda8705.scope: Deactivated successfully.
Jan 23 17:30:29.142245 systemd[1]: cri-containerd-32aed78b76544c076f5b266dacdad0c5e108e6f3baf421a20aa3fd075bda8705.scope: Consumed 3.508s CPU time, 25.4M memory peak.
Jan 23 17:30:29.143453 kernel: kauditd_printk_skb: 40 callbacks suppressed
Jan 23 17:30:29.143484 kernel: audit: type=1334 audit(1769189429.142:913): prog-id=267 op=LOAD
Jan 23 17:30:29.142000 audit: BPF prog-id=93 op=UNLOAD
Jan 23 17:30:29.144802 kernel: audit: type=1334 audit(1769189429.142:914): prog-id=93 op=UNLOAD
Jan 23 17:30:29.146068 containerd[1675]: time="2026-01-23T17:30:29.146001632Z" level=info msg="received container exit event container_id:\"32aed78b76544c076f5b266dacdad0c5e108e6f3baf421a20aa3fd075bda8705\" id:\"32aed78b76544c076f5b266dacdad0c5e108e6f3baf421a20aa3fd075bda8705\" pid:2747 exit_status:1 exited_at:{seconds:1769189429 nanos:145632150}"
Jan 23 17:30:29.146000 audit: BPF prog-id=103 op=UNLOAD
Jan 23 17:30:29.146000 audit: BPF prog-id=107 op=UNLOAD
Jan 23 17:30:29.148372 kernel: audit: type=1334 audit(1769189429.146:915): prog-id=103 op=UNLOAD
Jan 23 17:30:29.148467 kernel: audit: type=1334 audit(1769189429.146:916): prog-id=107 op=UNLOAD
Jan 23 17:30:29.166737 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-32aed78b76544c076f5b266dacdad0c5e108e6f3baf421a20aa3fd075bda8705-rootfs.mount: Deactivated successfully.
Jan 23 17:30:30.152486 kubelet[2900]: I0123 17:30:30.152457 2900 scope.go:117] "RemoveContainer" containerID="32aed78b76544c076f5b266dacdad0c5e108e6f3baf421a20aa3fd075bda8705"
Jan 23 17:30:30.154111 containerd[1675]: time="2026-01-23T17:30:30.154072097Z" level=info msg="CreateContainer within sandbox \"7e26985d6d93c8207e4710391631ebf09be9da0ac66b5bb1edc81400ecadbae6\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Jan 23 17:30:30.165150 containerd[1675]: time="2026-01-23T17:30:30.163591384Z" level=info msg="Container 5a6adf2ef95b8e0d59fc9175b48f113eaa05f6e2fbeaccd34c89b7685704a5ae: CDI devices from CRI Config.CDIDevices: []"
Jan 23 17:30:30.173202 containerd[1675]: time="2026-01-23T17:30:30.173145391Z" level=info msg="CreateContainer within sandbox \"7e26985d6d93c8207e4710391631ebf09be9da0ac66b5bb1edc81400ecadbae6\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"5a6adf2ef95b8e0d59fc9175b48f113eaa05f6e2fbeaccd34c89b7685704a5ae\""
Jan 23 17:30:30.173794 containerd[1675]: time="2026-01-23T17:30:30.173771874Z" level=info msg="StartContainer for \"5a6adf2ef95b8e0d59fc9175b48f113eaa05f6e2fbeaccd34c89b7685704a5ae\""
Jan 23 17:30:30.174830 containerd[1675]: time="2026-01-23T17:30:30.174800999Z" level=info msg="connecting to shim 5a6adf2ef95b8e0d59fc9175b48f113eaa05f6e2fbeaccd34c89b7685704a5ae" address="unix:///run/containerd/s/75146ef1517a783455e5098d3a21642e585a38568855b05f2ee462b15a145d02" protocol=ttrpc version=3
Jan 23 17:30:30.203783 systemd[1]: Started cri-containerd-5a6adf2ef95b8e0d59fc9175b48f113eaa05f6e2fbeaccd34c89b7685704a5ae.scope - libcontainer container 5a6adf2ef95b8e0d59fc9175b48f113eaa05f6e2fbeaccd34c89b7685704a5ae.
Jan 23 17:30:30.213000 audit: BPF prog-id=268 op=LOAD
Jan 23 17:30:30.214000 audit: BPF prog-id=269 op=LOAD
Jan 23 17:30:30.215774 kernel: audit: type=1334 audit(1769189430.213:917): prog-id=268 op=LOAD
Jan 23 17:30:30.215882 kernel: audit: type=1334 audit(1769189430.214:918): prog-id=269 op=LOAD
Jan 23 17:30:30.215902 kernel: audit: type=1300 audit(1769189430.214:918): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2595 pid=5795 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:30:30.214000 audit[5795]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2595 pid=5795 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:30:30.218687 kernel: audit: type=1327 audit(1769189430.214:918): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561366164663265663935623865306435396663393137356234386631
Jan 23 17:30:30.214000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561366164663265663935623865306435396663393137356234386631
Jan 23 17:30:30.214000 audit: BPF prog-id=269 op=UNLOAD
Jan 23 17:30:30.214000 audit[5795]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2595 pid=5795 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:30:30.225014 kernel: audit: type=1334 audit(1769189430.214:919): prog-id=269 op=UNLOAD
Jan 23 17:30:30.225067 kernel: audit: type=1300 audit(1769189430.214:919): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2595 pid=5795 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:30:30.214000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561366164663265663935623865306435396663393137356234386631
Jan 23 17:30:30.214000 audit: BPF prog-id=270 op=LOAD
Jan 23 17:30:30.214000 audit[5795]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2595 pid=5795 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:30:30.214000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561366164663265663935623865306435396663393137356234386631
Jan 23 17:30:30.215000 audit: BPF prog-id=271 op=LOAD
Jan 23 17:30:30.215000 audit[5795]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2595 pid=5795 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:30:30.215000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561366164663265663935623865306435396663393137356234386631
Jan 23 17:30:30.218000 audit: BPF prog-id=271 op=UNLOAD
Jan 23 17:30:30.218000 audit[5795]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2595 pid=5795 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:30:30.218000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561366164663265663935623865306435396663393137356234386631
Jan 23 17:30:30.218000 audit: BPF prog-id=270 op=UNLOAD
Jan 23 17:30:30.218000 audit[5795]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2595 pid=5795 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:30:30.218000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561366164663265663935623865306435396663393137356234386631
Jan 23 17:30:30.218000 audit: BPF prog-id=272 op=LOAD
Jan 23 17:30:30.218000 audit[5795]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2595 pid=5795 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 23 17:30:30.218000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561366164663265663935623865306435396663393137356234386631
Jan 23 17:30:30.252619 containerd[1675]: time="2026-01-23T17:30:30.252581100Z" level=info msg="StartContainer for \"5a6adf2ef95b8e0d59fc9175b48f113eaa05f6e2fbeaccd34c89b7685704a5ae\" returns successfully"
Jan 23 17:30:33.464678 kubelet[2900]: E0123 17:30:33.464637 2900 controller.go:195] "Failed to update lease" err="Put \"https://10.0.1.37:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-1-0-a-d0877fd079?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 23 17:30:35.419784 systemd[1]: cri-containerd-f8178449d120fb7547aaec3675f897f8f31d33319699114984209fed95dbff05.scope: Deactivated successfully.
Jan 23 17:30:35.420829 containerd[1675]: time="2026-01-23T17:30:35.420256411Z" level=info msg="received container exit event container_id:\"f8178449d120fb7547aaec3675f897f8f31d33319699114984209fed95dbff05\" id:\"f8178449d120fb7547aaec3675f897f8f31d33319699114984209fed95dbff05\" pid:5718 exit_status:1 exited_at:{seconds:1769189435 nanos:420024730}"
Jan 23 17:30:35.425000 audit: BPF prog-id=257 op=UNLOAD
Jan 23 17:30:35.426726 kernel: kauditd_printk_skb: 16 callbacks suppressed
Jan 23 17:30:35.426777 kernel: audit: type=1334 audit(1769189435.425:925): prog-id=257 op=UNLOAD
Jan 23 17:30:35.426802 kernel: audit: type=1334 audit(1769189435.425:926): prog-id=261 op=UNLOAD
Jan 23 17:30:35.425000 audit: BPF prog-id=261 op=UNLOAD
Jan 23 17:30:35.440027 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f8178449d120fb7547aaec3675f897f8f31d33319699114984209fed95dbff05-rootfs.mount: Deactivated successfully.
Jan 23 17:30:35.561593 containerd[1675]: time="2026-01-23T17:30:35.561559104Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Jan 23 17:30:35.901974 containerd[1675]: time="2026-01-23T17:30:35.901925254Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 23 17:30:35.903317 containerd[1675]: time="2026-01-23T17:30:35.903263580Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Jan 23 17:30:35.903380 containerd[1675]: time="2026-01-23T17:30:35.903329620Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0"
Jan 23 17:30:35.903536 kubelet[2900]: E0123 17:30:35.903484 2900 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Jan 23 17:30:35.903536 kubelet[2900]: E0123 17:30:35.903528 2900 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Jan 23 17:30:35.904125 kubelet[2900]: E0123 17:30:35.903590 2900 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-dd7cc585d-hrzqq_calico-apiserver(bdc6b38c-9de0-4613-a814-03b925209707): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Jan 23 17:30:35.904125 kubelet[2900]: E0123 17:30:35.903621 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-hrzqq" podUID="bdc6b38c-9de0-4613-a814-03b925209707"
Jan 23 17:30:36.171752 kubelet[2900]: I0123 17:30:36.171653 2900 scope.go:117] "RemoveContainer" containerID="64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06"
Jan 23 17:30:36.172140 kubelet[2900]: I0123 17:30:36.172043 2900 scope.go:117] "RemoveContainer" containerID="f8178449d120fb7547aaec3675f897f8f31d33319699114984209fed95dbff05"
Jan 23 17:30:36.172338 kubelet[2900]: E0123 17:30:36.172290 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-65cdcdfd6d-8gtkc_tigera-operator(dc847d02-b19a-44d5-92d4-7ea7f8b8f467)\"" pod="tigera-operator/tigera-operator-65cdcdfd6d-8gtkc" podUID="dc847d02-b19a-44d5-92d4-7ea7f8b8f467"
Jan 23 17:30:36.173623 containerd[1675]: time="2026-01-23T17:30:36.173592026Z" level=info msg="RemoveContainer for \"64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06\""
Jan 23 17:30:36.178678 containerd[1675]: time="2026-01-23T17:30:36.178631891Z" level=info msg="RemoveContainer for \"64fdb14b02771a41d2c5ca95d13c1a2764265d05f896186cb5140b274582ca06\" returns successfully"
Jan 23 17:30:36.564369 kubelet[2900]: E0123 17:30:36.562503 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-68b765b4fc-4fz2b" podUID="348a8202-26ae-4e13-80ce-0b3caa537034"
Jan 23 17:30:37.561565 kubelet[2900]: E0123 17:30:37.561514 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-ssq4v" podUID="4e4a5b57-4234-49d1-b775-45759cc5cd06"
Jan 23 17:30:38.025516 kubelet[2900]: E0123 17:30:38.025396 2900 kubelet_node_status.go:486] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"NetworkUnavailable\\\"},{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T17:30:28Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T17:30:28Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T17:30:28Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-23T17:30:28Z\\\",\\\"type\\\":\\\"Ready\\\"}]}}\" for node \"ci-4547-1-0-a-d0877fd079\": Patch \"https://10.0.1.37:6443/api/v1/nodes/ci-4547-1-0-a-d0877fd079/status?timeout=10s\": context deadline exceeded"
Jan 23 17:30:39.561477 kubelet[2900]: E0123 17:30:39.561396 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5bcfdccd6d-ph9sw" podUID="d8d63139-4f9a-445f-a5a9-c69b14220343"
Jan 23 17:30:40.562583 kubelet[2900]: E0123 17:30:40.562541 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-dd7cc585d-gp992" podUID="bad712f1-634f-4629-aa4d-a4a0636cb622"
Jan 23 17:30:40.562971 kubelet[2900]: E0123 17:30:40.562679 2900 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-p68bw" podUID="baa2fc5a-f231-4d23-992e-37e27c865a7c"
Jan 23 17:30:40.915373 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec
Jan 23 17:30:43.465320 kubelet[2900]: E0123 17:30:43.465232 2900 controller.go:195] "Failed to update lease" err="Put \"https://10.0.1.37:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-1-0-a-d0877fd079?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"