Jan 16 17:58:44.403334 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Jan 16 17:58:44.403357 kernel: Linux version 6.12.65-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Fri Jan 16 03:04:27 -00 2026
Jan 16 17:58:44.403367 kernel: KASLR enabled
Jan 16 17:58:44.403372 kernel: efi: EFI v2.7 by EDK II
Jan 16 17:58:44.403378 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438357218
Jan 16 17:58:44.403384 kernel: random: crng init done
Jan 16 17:58:44.403391 kernel: secureboot: Secure boot disabled
Jan 16 17:58:44.403397 kernel: ACPI: Early table checksum verification disabled
Jan 16 17:58:44.403403 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Jan 16 17:58:44.403411 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Jan 16 17:58:44.403417 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 17:58:44.403437 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 17:58:44.403443 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 17:58:44.403449 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 17:58:44.403458 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 17:58:44.403465 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 17:58:44.403472 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 17:58:44.403478 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 17:58:44.403485 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 17:58:44.403491 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 16 17:58:44.403498 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Jan 16 17:58:44.403504 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Jan 16 17:58:44.403511 kernel: ACPI: Use ACPI SPCR as default console: Yes
Jan 16 17:58:44.403518 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Jan 16 17:58:44.403524 kernel: NODE_DATA(0) allocated [mem 0x43dff2a00-0x43dff9fff]
Jan 16 17:58:44.403531 kernel: Zone ranges:
Jan 16 17:58:44.403537 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Jan 16 17:58:44.403543 kernel: DMA32 empty
Jan 16 17:58:44.403550 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Jan 16 17:58:44.403556 kernel: Device empty
Jan 16 17:58:44.403563 kernel: Movable zone start for each node
Jan 16 17:58:44.403569 kernel: Early memory node ranges
Jan 16 17:58:44.403575 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Jan 16 17:58:44.403582 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Jan 16 17:58:44.403588 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Jan 16 17:58:44.403596 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Jan 16 17:58:44.403602 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Jan 16 17:58:44.403609 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Jan 16 17:58:44.403615 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Jan 16 17:58:44.403622 kernel: psci: probing for conduit method from ACPI.
Jan 16 17:58:44.403631 kernel: psci: PSCIv1.3 detected in firmware.
Jan 16 17:58:44.403639 kernel: psci: Using standard PSCI v0.2 function IDs
Jan 16 17:58:44.403645 kernel: psci: Trusted OS migration not required
Jan 16 17:58:44.403652 kernel: psci: SMC Calling Convention v1.1
Jan 16 17:58:44.403659 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Jan 16 17:58:44.403666 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Jan 16 17:58:44.403673 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Jan 16 17:58:44.403679 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Jan 16 17:58:44.403686 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Jan 16 17:58:44.403694 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Jan 16 17:58:44.403701 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Jan 16 17:58:44.403708 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Jan 16 17:58:44.403715 kernel: Detected PIPT I-cache on CPU0
Jan 16 17:58:44.403721 kernel: CPU features: detected: GIC system register CPU interface
Jan 16 17:58:44.403728 kernel: CPU features: detected: Spectre-v4
Jan 16 17:58:44.403735 kernel: CPU features: detected: Spectre-BHB
Jan 16 17:58:44.403742 kernel: CPU features: kernel page table isolation forced ON by KASLR
Jan 16 17:58:44.403748 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Jan 16 17:58:44.403755 kernel: CPU features: detected: ARM erratum 1418040
Jan 16 17:58:44.403762 kernel: CPU features: detected: SSBS not fully self-synchronizing
Jan 16 17:58:44.403770 kernel: alternatives: applying boot alternatives
Jan 16 17:58:44.403778 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=924fb3eb04ba1d8edcb66284d30e3342855b0579b62556e7722bcf37e82bda13
Jan 16 17:58:44.403785 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Jan 16 17:58:44.403791 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 16 17:58:44.403799 kernel: Fallback order for Node 0: 0
Jan 16 17:58:44.403805 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Jan 16 17:58:44.403812 kernel: Policy zone: Normal
Jan 16 17:58:44.403819 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 16 17:58:44.403825 kernel: software IO TLB: area num 4.
Jan 16 17:58:44.403832 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Jan 16 17:58:44.403840 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Jan 16 17:58:44.403847 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 16 17:58:44.403854 kernel: rcu: RCU event tracing is enabled.
Jan 16 17:58:44.403861 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Jan 16 17:58:44.403868 kernel: Trampoline variant of Tasks RCU enabled.
Jan 16 17:58:44.403875 kernel: Tracing variant of Tasks RCU enabled.
Jan 16 17:58:44.403882 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 16 17:58:44.403889 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Jan 16 17:58:44.403896 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 16 17:58:44.403903 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 16 17:58:44.403910 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Jan 16 17:58:44.403918 kernel: GICv3: 256 SPIs implemented
Jan 16 17:58:44.403924 kernel: GICv3: 0 Extended SPIs implemented
Jan 16 17:58:44.403931 kernel: Root IRQ handler: gic_handle_irq
Jan 16 17:58:44.403938 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Jan 16 17:58:44.403944 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Jan 16 17:58:44.403951 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Jan 16 17:58:44.403958 kernel: ITS [mem 0x08080000-0x0809ffff]
Jan 16 17:58:44.403965 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Jan 16 17:58:44.403972 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Jan 16 17:58:44.403978 kernel: GICv3: using LPI property table @0x0000000100130000
Jan 16 17:58:44.403985 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Jan 16 17:58:44.403992 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 16 17:58:44.404000 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 16 17:58:44.404007 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Jan 16 17:58:44.404014 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Jan 16 17:58:44.404021 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Jan 16 17:58:44.404028 kernel: arm-pv: using stolen time PV
Jan 16 17:58:44.404035 kernel: Console: colour dummy device 80x25
Jan 16 17:58:44.404042 kernel: ACPI: Core revision 20240827
Jan 16 17:58:44.404050 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Jan 16 17:58:44.404058 kernel: pid_max: default: 32768 minimum: 301
Jan 16 17:58:44.404065 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jan 16 17:58:44.404073 kernel: landlock: Up and running.
Jan 16 17:58:44.404080 kernel: SELinux: Initializing.
Jan 16 17:58:44.404087 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 16 17:58:44.404094 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 16 17:58:44.404101 kernel: rcu: Hierarchical SRCU implementation.
Jan 16 17:58:44.404109 kernel: rcu: Max phase no-delay instances is 400.
Jan 16 17:58:44.404117 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jan 16 17:58:44.404125 kernel: Remapping and enabling EFI services.
Jan 16 17:58:44.404132 kernel: smp: Bringing up secondary CPUs ...
Jan 16 17:58:44.404139 kernel: Detected PIPT I-cache on CPU1
Jan 16 17:58:44.404146 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Jan 16 17:58:44.404153 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Jan 16 17:58:44.404160 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 16 17:58:44.404169 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Jan 16 17:58:44.404176 kernel: Detected PIPT I-cache on CPU2
Jan 16 17:58:44.404188 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Jan 16 17:58:44.404197 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Jan 16 17:58:44.404204 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 16 17:58:44.404211 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Jan 16 17:58:44.404219 kernel: Detected PIPT I-cache on CPU3
Jan 16 17:58:44.404226 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Jan 16 17:58:44.404235 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Jan 16 17:58:44.404243 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 16 17:58:44.404250 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Jan 16 17:58:44.404258 kernel: smp: Brought up 1 node, 4 CPUs
Jan 16 17:58:44.404265 kernel: SMP: Total of 4 processors activated.
Jan 16 17:58:44.404273 kernel: CPU: All CPU(s) started at EL1
Jan 16 17:58:44.404281 kernel: CPU features: detected: 32-bit EL0 Support
Jan 16 17:58:44.404289 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Jan 16 17:58:44.404297 kernel: CPU features: detected: Common not Private translations
Jan 16 17:58:44.404304 kernel: CPU features: detected: CRC32 instructions
Jan 16 17:58:44.404312 kernel: CPU features: detected: Enhanced Virtualization Traps
Jan 16 17:58:44.404320 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Jan 16 17:58:44.404327 kernel: CPU features: detected: LSE atomic instructions
Jan 16 17:58:44.404336 kernel: CPU features: detected: Privileged Access Never
Jan 16 17:58:44.404343 kernel: CPU features: detected: RAS Extension Support
Jan 16 17:58:44.404350 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Jan 16 17:58:44.404358 kernel: alternatives: applying system-wide alternatives
Jan 16 17:58:44.404366 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Jan 16 17:58:44.404374 kernel: Memory: 16324372K/16777216K available (11200K kernel code, 2458K rwdata, 9092K rodata, 12480K init, 1038K bss, 430060K reserved, 16384K cma-reserved)
Jan 16 17:58:44.404381 kernel: devtmpfs: initialized
Jan 16 17:58:44.404390 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 16 17:58:44.404398 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Jan 16 17:58:44.404405 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Jan 16 17:58:44.404412 kernel: 0 pages in range for non-PLT usage
Jan 16 17:58:44.404428 kernel: 515152 pages in range for PLT usage
Jan 16 17:58:44.404437 kernel: pinctrl core: initialized pinctrl subsystem
Jan 16 17:58:44.404445 kernel: SMBIOS 3.0.0 present.
Jan 16 17:58:44.404455 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Jan 16 17:58:44.404465 kernel: DMI: Memory slots populated: 1/1
Jan 16 17:58:44.404473 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 16 17:58:44.404480 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Jan 16 17:58:44.404488 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 16 17:58:44.404495 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 16 17:58:44.404503 kernel: audit: initializing netlink subsys (disabled)
Jan 16 17:58:44.404511 kernel: audit: type=2000 audit(0.034:1): state=initialized audit_enabled=0 res=1
Jan 16 17:58:44.404519 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 16 17:58:44.404527 kernel: cpuidle: using governor menu
Jan 16 17:58:44.404535 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Jan 16 17:58:44.404542 kernel: ASID allocator initialised with 32768 entries
Jan 16 17:58:44.404549 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 16 17:58:44.404557 kernel: Serial: AMBA PL011 UART driver
Jan 16 17:58:44.404564 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 16 17:58:44.404573 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Jan 16 17:58:44.404580 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Jan 16 17:58:44.404588 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Jan 16 17:58:44.404595 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 16 17:58:44.404603 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Jan 16 17:58:44.404610 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Jan 16 17:58:44.404617 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Jan 16 17:58:44.404626 kernel: ACPI: Added _OSI(Module Device)
Jan 16 17:58:44.404633 kernel: ACPI: Added _OSI(Processor Device)
Jan 16 17:58:44.404641 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 16 17:58:44.404648 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 16 17:58:44.404655 kernel: ACPI: Interpreter enabled
Jan 16 17:58:44.404663 kernel: ACPI: Using GIC for interrupt routing
Jan 16 17:58:44.404670 kernel: ACPI: MCFG table detected, 1 entries
Jan 16 17:58:44.404677 kernel: ACPI: CPU0 has been hot-added
Jan 16 17:58:44.404686 kernel: ACPI: CPU1 has been hot-added
Jan 16 17:58:44.404693 kernel: ACPI: CPU2 has been hot-added
Jan 16 17:58:44.404701 kernel: ACPI: CPU3 has been hot-added
Jan 16 17:58:44.404708 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Jan 16 17:58:44.404716 kernel: printk: legacy console [ttyAMA0] enabled
Jan 16 17:58:44.404723 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 16 17:58:44.404901 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 16 17:58:44.404994 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jan 16 17:58:44.405075 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jan 16 17:58:44.405155 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Jan 16 17:58:44.405235 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Jan 16 17:58:44.405244 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Jan 16 17:58:44.405252 kernel: PCI host bridge to bus 0000:00
Jan 16 17:58:44.405341 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Jan 16 17:58:44.405417 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Jan 16 17:58:44.405514 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Jan 16 17:58:44.405599 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 16 17:58:44.405699 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Jan 16 17:58:44.405804 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.405890 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Jan 16 17:58:44.405998 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jan 16 17:58:44.406078 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Jan 16 17:58:44.406157 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Jan 16 17:58:44.406245 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.406327 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Jan 16 17:58:44.406406 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Jan 16 17:58:44.406498 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Jan 16 17:58:44.406585 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.406665 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Jan 16 17:58:44.406747 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Jan 16 17:58:44.406826 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Jan 16 17:58:44.406905 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Jan 16 17:58:44.406992 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.407071 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Jan 16 17:58:44.407149 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Jan 16 17:58:44.407230 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Jan 16 17:58:44.407317 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.407396 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Jan 16 17:58:44.407502 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Jan 16 17:58:44.407582 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Jan 16 17:58:44.407661 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Jan 16 17:58:44.407751 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.407829 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Jan 16 17:58:44.407906 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Jan 16 17:58:44.407984 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Jan 16 17:58:44.408062 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Jan 16 17:58:44.408145 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.408225 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Jan 16 17:58:44.408302 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Jan 16 17:58:44.408387 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.408483 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Jan 16 17:58:44.408564 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Jan 16 17:58:44.408652 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.408735 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Jan 16 17:58:44.408832 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Jan 16 17:58:44.408921 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.409000 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Jan 16 17:58:44.409081 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Jan 16 17:58:44.409165 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.409243 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Jan 16 17:58:44.409321 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Jan 16 17:58:44.409407 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.409498 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Jan 16 17:58:44.409583 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Jan 16 17:58:44.409670 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.409751 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Jan 16 17:58:44.409830 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Jan 16 17:58:44.409916 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.409996 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Jan 16 17:58:44.410095 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Jan 16 17:58:44.410214 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.410294 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Jan 16 17:58:44.410374 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Jan 16 17:58:44.410470 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.410555 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Jan 16 17:58:44.410636 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Jan 16 17:58:44.410729 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.410819 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Jan 16 17:58:44.410897 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Jan 16 17:58:44.410982 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.411064 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Jan 16 17:58:44.411142 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Jan 16 17:58:44.411220 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Jan 16 17:58:44.411298 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Jan 16 17:58:44.411385 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.411480 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Jan 16 17:58:44.411563 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Jan 16 17:58:44.411642 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Jan 16 17:58:44.411722 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Jan 16 17:58:44.411808 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.411886 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Jan 16 17:58:44.411963 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Jan 16 17:58:44.412042 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Jan 16 17:58:44.412119 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Jan 16 17:58:44.412205 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.412285 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Jan 16 17:58:44.412363 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Jan 16 17:58:44.412449 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Jan 16 17:58:44.412535 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Jan 16 17:58:44.412627 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.412708 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Jan 16 17:58:44.412801 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Jan 16 17:58:44.412881 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Jan 16 17:58:44.412959 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Jan 16 17:58:44.413046 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.413125 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Jan 16 17:58:44.413203 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Jan 16 17:58:44.413281 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Jan 16 17:58:44.413358 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Jan 16 17:58:44.413451 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.413533 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Jan 16 17:58:44.413612 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Jan 16 17:58:44.413691 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Jan 16 17:58:44.413771 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Jan 16 17:58:44.413859 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.413939 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Jan 16 17:58:44.414019 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Jan 16 17:58:44.414098 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Jan 16 17:58:44.414178 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Jan 16 17:58:44.414264 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.414344 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Jan 16 17:58:44.414434 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Jan 16 17:58:44.414526 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Jan 16 17:58:44.414606 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Jan 16 17:58:44.414692 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.414772 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Jan 16 17:58:44.414850 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Jan 16 17:58:44.414929 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Jan 16 17:58:44.415011 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Jan 16 17:58:44.415099 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.415180 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Jan 16 17:58:44.415262 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Jan 16 17:58:44.415361 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Jan 16 17:58:44.415450 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Jan 16 17:58:44.415540 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.415619 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Jan 16 17:58:44.415699 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Jan 16 17:58:44.415778 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Jan 16 17:58:44.415856 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Jan 16 17:58:44.415944 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.416036 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Jan 16 17:58:44.416118 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Jan 16 17:58:44.416196 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Jan 16 17:58:44.416279 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Jan 16 17:58:44.416371 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.416461 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Jan 16 17:58:44.416544 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Jan 16 17:58:44.416623 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Jan 16 17:58:44.416701 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Jan 16 17:58:44.416814 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.416899 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Jan 16 17:58:44.416978 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Jan 16 17:58:44.417056 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Jan 16 17:58:44.417135 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Jan 16 17:58:44.417219 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 16 17:58:44.417302 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Jan 16 17:58:44.417382 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Jan 16 17:58:44.417474 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Jan 16 17:58:44.417555 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Jan 16 17:58:44.417644 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 16 17:58:44.417726 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Jan 16 17:58:44.417810 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Jan 16 17:58:44.417890 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jan 16 17:58:44.417977 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Jan 16 17:58:44.418059 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Jan 16 17:58:44.418146 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Jan 16 17:58:44.418230 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Jan 16 17:58:44.418312 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Jan 16 17:58:44.418402 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jan 16 17:58:44.418500 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Jan 16 17:58:44.418595 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jan 16 17:58:44.418687 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Jan 16 17:58:44.418774 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Jan 16 17:58:44.418867 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Jan 16 17:58:44.418949 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Jan 16 17:58:44.419030 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Jan 16 17:58:44.419111 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Jan 16 17:58:44.419196 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Jan 16 17:58:44.419279 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Jan 16 17:58:44.419363 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Jan 16 17:58:44.419454 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Jan 16 17:58:44.419537 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Jan 16 17:58:44.419618 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jan 16 17:58:44.419698 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Jan 16 17:58:44.419776 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Jan 16 17:58:44.419857 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jan 16 17:58:44.419938 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Jan 16 17:58:44.420016 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Jan 16 17:58:44.420096 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jan 16 17:58:44.420175 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Jan 16 17:58:44.420253 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Jan 16 17:58:44.420334 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 16 17:58:44.420415 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Jan 16 17:58:44.420507 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Jan 16 17:58:44.420589 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 16 17:58:44.420667 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Jan 16 17:58:44.420746 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Jan 16 17:58:44.420844 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 16 17:58:44.420927 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Jan 16 17:58:44.421006 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Jan 16 17:58:44.421088 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 16 17:58:44.421167 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Jan 16 17:58:44.421245 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Jan 16 17:58:44.421330 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 16 17:58:44.421410 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Jan 16 17:58:44.421498 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Jan 16 17:58:44.421581 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Jan 16 17:58:44.421659 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Jan 16 17:58:44.421737 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000
Jan 16 17:58:44.421821 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Jan 16 17:58:44.421900 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000
Jan 16 17:58:44.421977 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000
Jan 16 17:58:44.422059 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Jan 16 17:58:44.422137 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000
Jan 16 17:58:44.422215 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000
Jan 16
17:58:44.422298 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 16 17:58:44.422378 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Jan 16 17:58:44.422463 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Jan 16 17:58:44.422545 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 16 17:58:44.422624 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Jan 16 17:58:44.422707 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Jan 16 17:58:44.422789 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 16 17:58:44.422867 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Jan 16 17:58:44.422946 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Jan 16 17:58:44.423026 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 16 17:58:44.423105 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Jan 16 17:58:44.423192 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Jan 16 17:58:44.423276 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 16 17:58:44.423356 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Jan 16 17:58:44.423447 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 Jan 16 17:58:44.423532 kernel: pci 0000:00:03.2: 
bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 16 17:58:44.423617 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Jan 16 17:58:44.423696 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Jan 16 17:58:44.423778 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 16 17:58:44.423858 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Jan 16 17:58:44.423939 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Jan 16 17:58:44.424025 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 16 17:58:44.424107 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Jan 16 17:58:44.424186 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Jan 16 17:58:44.424269 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 16 17:58:44.424349 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Jan 16 17:58:44.424436 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Jan 16 17:58:44.424520 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 16 17:58:44.424602 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Jan 16 17:58:44.424682 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Jan 16 17:58:44.424782 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] 
add_size 1000 Jan 16 17:58:44.424869 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Jan 16 17:58:44.424950 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Jan 16 17:58:44.425035 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 16 17:58:44.425116 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Jan 16 17:58:44.425199 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Jan 16 17:58:44.425283 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 16 17:58:44.425366 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Jan 16 17:58:44.425463 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Jan 16 17:58:44.425572 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 16 17:58:44.425654 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Jan 16 17:58:44.425736 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Jan 16 17:58:44.425817 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 16 17:58:44.425896 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Jan 16 17:58:44.425977 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Jan 16 17:58:44.426058 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 16 17:58:44.426137 kernel: 
pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Jan 16 17:58:44.426218 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Jan 16 17:58:44.426300 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 16 17:58:44.426379 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Jan 16 17:58:44.426470 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Jan 16 17:58:44.426561 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 16 17:58:44.426645 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Jan 16 17:58:44.426724 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Jan 16 17:58:44.426806 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 16 17:58:44.426885 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Jan 16 17:58:44.426966 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Jan 16 17:58:44.427046 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 16 17:58:44.427145 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Jan 16 17:58:44.427224 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Jan 16 17:58:44.427305 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Jan 16 17:58:44.427387 kernel: pci 0000:00:01.0: bridge window [mem 
0x8000000000-0x80001fffff 64bit pref]: assigned Jan 16 17:58:44.427479 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Jan 16 17:58:44.427558 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Jan 16 17:58:44.427639 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Jan 16 17:58:44.427720 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Jan 16 17:58:44.427801 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Jan 16 17:58:44.427881 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Jan 16 17:58:44.427968 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Jan 16 17:58:44.428049 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Jan 16 17:58:44.428131 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Jan 16 17:58:44.428211 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Jan 16 17:58:44.428291 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Jan 16 17:58:44.428371 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Jan 16 17:58:44.428463 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Jan 16 17:58:44.428544 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Jan 16 17:58:44.428625 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Jan 16 17:58:44.428706 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Jan 16 17:58:44.428805 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Jan 16 17:58:44.428892 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: 
assigned Jan 16 17:58:44.428975 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Jan 16 17:58:44.429055 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Jan 16 17:58:44.429138 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Jan 16 17:58:44.429221 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Jan 16 17:58:44.429305 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Jan 16 17:58:44.429405 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Jan 16 17:58:44.429498 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Jan 16 17:58:44.429584 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Jan 16 17:58:44.429667 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Jan 16 17:58:44.429747 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Jan 16 17:58:44.429829 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Jan 16 17:58:44.429907 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Jan 16 17:58:44.429987 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Jan 16 17:58:44.430068 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Jan 16 17:58:44.430149 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Jan 16 17:58:44.430239 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Jan 16 17:58:44.430324 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Jan 16 17:58:44.430408 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Jan 16 17:58:44.430501 kernel: pci 
0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Jan 16 17:58:44.430585 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Jan 16 17:58:44.430665 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Jan 16 17:58:44.430743 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Jan 16 17:58:44.430842 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Jan 16 17:58:44.430920 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Jan 16 17:58:44.431001 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Jan 16 17:58:44.431079 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Jan 16 17:58:44.431161 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Jan 16 17:58:44.431240 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Jan 16 17:58:44.431320 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Jan 16 17:58:44.431398 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Jan 16 17:58:44.431497 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Jan 16 17:58:44.431579 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Jan 16 17:58:44.431662 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Jan 16 17:58:44.431742 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Jan 16 17:58:44.431821 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Jan 16 17:58:44.431900 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Jan 16 17:58:44.431979 kernel: pci 0000:00:04.4: bridge window [mem 
0x13800000-0x139fffff]: assigned Jan 16 17:58:44.432058 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Jan 16 17:58:44.432138 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Jan 16 17:58:44.432216 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Jan 16 17:58:44.432296 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Jan 16 17:58:44.432375 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Jan 16 17:58:44.432463 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Jan 16 17:58:44.432543 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Jan 16 17:58:44.432621 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Jan 16 17:58:44.432701 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Jan 16 17:58:44.432799 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Jan 16 17:58:44.432880 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Jan 16 17:58:44.432959 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Jan 16 17:58:44.433037 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Jan 16 17:58:44.433115 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Jan 16 17:58:44.433197 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Jan 16 17:58:44.433276 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Jan 16 17:58:44.433355 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Jan 16 17:58:44.433442 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Jan 16 17:58:44.433522 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Jan 16 17:58:44.433602 kernel: pci 0000:00:01.5: BAR 0 
[mem 0x14205000-0x14205fff]: assigned Jan 16 17:58:44.433680 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Jan 16 17:58:44.433762 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned Jan 16 17:58:44.433841 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Jan 16 17:58:44.433920 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Jan 16 17:58:44.433999 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Jan 16 17:58:44.434079 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Jan 16 17:58:44.434157 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Jan 16 17:58:44.434238 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Jan 16 17:58:44.434316 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Jan 16 17:58:44.434394 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Jan 16 17:58:44.434481 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Jan 16 17:58:44.434561 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Jan 16 17:58:44.434639 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Jan 16 17:58:44.434719 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Jan 16 17:58:44.434799 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Jan 16 17:58:44.434878 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Jan 16 17:58:44.434956 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Jan 16 17:58:44.435035 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Jan 16 17:58:44.435117 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Jan 16 17:58:44.435197 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Jan 16 17:58:44.435276 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 16 
17:58:44.435354 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.435445 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Jan 16 17:58:44.435525 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.435609 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.435691 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Jan 16 17:58:44.435772 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.435857 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.435935 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Jan 16 17:58:44.436016 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.436098 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.436180 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Jan 16 17:58:44.436260 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.436346 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.436436 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Jan 16 17:58:44.436521 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.436602 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.436684 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Jan 16 17:58:44.436773 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.436856 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.436935 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Jan 16 17:58:44.437014 kernel: pci 
0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.437092 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.437175 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Jan 16 17:58:44.437254 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.437333 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.437434 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Jan 16 17:58:44.437519 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.437600 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.437680 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Jan 16 17:58:44.437762 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.437843 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.437923 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Jan 16 17:58:44.438003 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.438083 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.438165 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Jan 16 17:58:44.438248 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.438332 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.438414 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Jan 16 17:58:44.438511 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.438612 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.438695 kernel: pci 0000:00:04.5: BAR 0 [mem 
0x1421d000-0x1421dfff]: assigned Jan 16 17:58:44.438783 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.438864 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.438947 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Jan 16 17:58:44.439029 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.439111 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.439210 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Jan 16 17:58:44.439293 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.439379 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.439468 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Jan 16 17:58:44.439547 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.439626 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.439705 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Jan 16 17:58:44.439784 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Jan 16 17:58:44.439872 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Jan 16 17:58:44.439951 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Jan 16 17:58:44.440031 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Jan 16 17:58:44.440115 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Jan 16 17:58:44.440195 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Jan 16 17:58:44.440276 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Jan 16 17:58:44.440355 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Jan 16 17:58:44.440449 kernel: pci 0000:00:03.7: 
bridge window [io 0xa000-0xafff]: assigned Jan 16 17:58:44.440534 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Jan 16 17:58:44.440614 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Jan 16 17:58:44.440694 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Jan 16 17:58:44.440795 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Jan 16 17:58:44.440881 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Jan 16 17:58:44.440964 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.441043 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.441125 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.441205 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.441284 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.441364 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.441453 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.441534 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.441615 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.441698 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.441778 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.441858 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.441939 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.442020 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.442101 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: 
can't assign; no space Jan 16 17:58:44.442191 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.442272 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.442355 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.442443 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.442522 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.442602 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.442684 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.442766 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.442848 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.442948 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.443027 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.443107 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.443189 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.443269 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.443348 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.443442 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.443530 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.443619 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.443701 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.443783 kernel: pci 0000:00:01.0: bridge 
window [io size 0x1000]: can't assign; no space Jan 16 17:58:44.443863 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Jan 16 17:58:44.443953 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Jan 16 17:58:44.444036 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Jan 16 17:58:44.444120 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Jan 16 17:58:44.444200 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 16 17:58:44.444281 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Jan 16 17:58:44.444365 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Jan 16 17:58:44.444467 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Jan 16 17:58:44.444550 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Jan 16 17:58:44.444633 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Jan 16 17:58:44.444715 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Jan 16 17:58:44.444816 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Jan 16 17:58:44.444900 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Jan 16 17:58:44.444979 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Jan 16 17:58:44.445058 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Jan 16 17:58:44.445140 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Jan 16 17:58:44.445225 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Jan 16 17:58:44.445308 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Jan 16 17:58:44.445386 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Jan 16 17:58:44.445511 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Jan 16 17:58:44.445602 kernel: pci 0000:05:00.0: BAR 4 [mem 
0x8000800000-0x8000803fff 64bit pref]: assigned Jan 16 17:58:44.445690 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Jan 16 17:58:44.445770 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Jan 16 17:58:44.445852 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Jan 16 17:58:44.445932 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Jan 16 17:58:44.446017 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Jan 16 17:58:44.446099 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Jan 16 17:58:44.446189 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 16 17:58:44.446273 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Jan 16 17:58:44.446353 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 16 17:58:44.446448 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 16 17:58:44.446530 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Jan 16 17:58:44.446612 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 16 17:58:44.446700 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 16 17:58:44.446784 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Jan 16 17:58:44.446868 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 16 17:58:44.446955 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 16 17:58:44.447057 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Jan 16 17:58:44.447140 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jan 16 17:58:44.447222 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 16 17:58:44.447301 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Jan 16 17:58:44.447379 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Jan 16 17:58:44.447467 kernel: pci 
0000:00:02.2: PCI bridge to [bus 0b] Jan 16 17:58:44.447546 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Jan 16 17:58:44.447627 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Jan 16 17:58:44.447706 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 16 17:58:44.447785 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Jan 16 17:58:44.447871 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Jan 16 17:58:44.447951 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 16 17:58:44.448031 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Jan 16 17:58:44.448112 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Jan 16 17:58:44.448192 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 16 17:58:44.448273 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Jan 16 17:58:44.448351 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 16 17:58:44.448436 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 16 17:58:44.448518 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Jan 16 17:58:44.448597 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 16 17:58:44.448676 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 16 17:58:44.448755 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Jan 16 17:58:44.448858 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 16 17:58:44.448941 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 16 17:58:44.449020 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Jan 16 17:58:44.449098 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Jan 16 17:58:44.449177 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 16 17:58:44.449257 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Jan 16 17:58:44.449335 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Jan 16 17:58:44.449416 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 16 17:58:44.449507 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Jan 16 17:58:44.449586 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Jan 16 17:58:44.449663 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Jan 16 17:58:44.449743 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 16 17:58:44.449821 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Jan 16 17:58:44.449902 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Jan 16 17:58:44.449981 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Jan 16 17:58:44.450060 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 16 17:58:44.450138 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Jan 16 17:58:44.450217 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Jan 16 17:58:44.450295 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Jan 16 17:58:44.450375 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 16 17:58:44.450462 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Jan 16 17:58:44.450541 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Jan 16 17:58:44.450620 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 16 17:58:44.450701 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 16 17:58:44.450780 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Jan 16 17:58:44.450859 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Jan 16 17:58:44.450938 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 16 17:58:44.451021 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 16 17:58:44.451102 kernel: pci 
0000:00:03.7: bridge window [io 0xa000-0xafff] Jan 16 17:58:44.451182 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Jan 16 17:58:44.451263 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 16 17:58:44.451344 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 16 17:58:44.451436 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Jan 16 17:58:44.451521 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Jan 16 17:58:44.451603 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Jan 16 17:58:44.451688 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 16 17:58:44.451785 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Jan 16 17:58:44.451872 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Jan 16 17:58:44.451953 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Jan 16 17:58:44.452039 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 16 17:58:44.452122 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Jan 16 17:58:44.452201 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Jan 16 17:58:44.452281 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Jan 16 17:58:44.452362 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 16 17:58:44.452451 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Jan 16 17:58:44.452536 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Jan 16 17:58:44.452617 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Jan 16 17:58:44.452713 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 16 17:58:44.452816 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Jan 16 17:58:44.452903 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Jan 16 17:58:44.452982 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] 
Jan 16 17:58:44.453067 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 16 17:58:44.453146 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Jan 16 17:58:44.453230 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Jan 16 17:58:44.453329 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 16 17:58:44.453415 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 16 17:58:44.453520 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Jan 16 17:58:44.453601 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Jan 16 17:58:44.453680 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 16 17:58:44.453763 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 16 17:58:44.453846 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Jan 16 17:58:44.453927 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Jan 16 17:58:44.454006 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 16 17:58:44.454089 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 16 17:58:44.454169 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Jan 16 17:58:44.454248 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Jan 16 17:58:44.454326 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Jan 16 17:58:44.454408 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jan 16 17:58:44.454511 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Jan 16 17:58:44.454586 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jan 16 17:58:44.454669 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jan 16 17:58:44.454743 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jan 16 17:58:44.454827 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jan 16 17:58:44.454900 kernel: pci_bus 
0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jan 16 17:58:44.454980 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jan 16 17:58:44.455053 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jan 16 17:58:44.455135 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jan 16 17:58:44.455210 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jan 16 17:58:44.455292 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jan 16 17:58:44.455366 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jan 16 17:58:44.455455 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jan 16 17:58:44.455532 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 16 17:58:44.455613 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jan 16 17:58:44.455690 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 16 17:58:44.455769 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jan 16 17:58:44.455842 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 16 17:58:44.455926 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jan 16 17:58:44.456000 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jan 16 17:58:44.456082 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Jan 16 17:58:44.456155 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Jan 16 17:58:44.456236 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Jan 16 17:58:44.456309 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Jan 16 17:58:44.456388 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Jan 16 17:58:44.456473 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Jan 16 17:58:44.456555 kernel: pci_bus 
0000:0d: resource 1 [mem 0x11800000-0x119fffff] Jan 16 17:58:44.456628 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Jan 16 17:58:44.456707 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Jan 16 17:58:44.456801 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 16 17:58:44.456889 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Jan 16 17:58:44.456964 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 16 17:58:44.457044 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Jan 16 17:58:44.457118 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 16 17:58:44.457201 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Jan 16 17:58:44.457275 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Jan 16 17:58:44.457357 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Jan 16 17:58:44.457448 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Jan 16 17:58:44.457532 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Jan 16 17:58:44.457606 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Jan 16 17:58:44.457682 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Jan 16 17:58:44.457764 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Jan 16 17:58:44.457837 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Jan 16 17:58:44.457910 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Jan 16 17:58:44.457989 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Jan 16 17:58:44.458062 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Jan 16 17:58:44.458136 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Jan 16 17:58:44.458215 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Jan 16 17:58:44.458288 
kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Jan 16 17:58:44.458361 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 16 17:58:44.458456 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Jan 16 17:58:44.458536 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Jan 16 17:58:44.458609 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 16 17:58:44.458688 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Jan 16 17:58:44.458762 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Jan 16 17:58:44.458834 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 16 17:58:44.458913 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Jan 16 17:58:44.458989 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Jan 16 17:58:44.459062 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Jan 16 17:58:44.459145 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Jan 16 17:58:44.459218 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Jan 16 17:58:44.459291 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Jan 16 17:58:44.459369 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 16 17:58:44.459455 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Jan 16 17:58:44.459530 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Jan 16 17:58:44.459609 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Jan 16 17:58:44.459682 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Jan 16 17:58:44.459755 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Jan 16 17:58:44.459835 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Jan 16 17:58:44.459909 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Jan 16 17:58:44.459981 kernel: pci_bus 0000:1d: resource 2 [mem 
0x8003800000-0x80039fffff 64bit pref] Jan 16 17:58:44.460060 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Jan 16 17:58:44.460133 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Jan 16 17:58:44.460206 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 16 17:58:44.460287 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Jan 16 17:58:44.460360 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Jan 16 17:58:44.460439 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 16 17:58:44.460523 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Jan 16 17:58:44.460596 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Jan 16 17:58:44.460669 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 16 17:58:44.460750 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Jan 16 17:58:44.460849 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Jan 16 17:58:44.460924 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Jan 16 17:58:44.460934 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jan 16 17:58:44.460943 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jan 16 17:58:44.460951 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jan 16 17:58:44.460962 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jan 16 17:58:44.460970 kernel: iommu: Default domain type: Translated Jan 16 17:58:44.460978 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jan 16 17:58:44.460986 kernel: efivars: Registered efivars operations Jan 16 17:58:44.460994 kernel: vgaarb: loaded Jan 16 17:58:44.461002 kernel: clocksource: Switched to clocksource arch_sys_counter Jan 16 17:58:44.461010 kernel: VFS: Disk quotas dquot_6.6.0 Jan 16 17:58:44.461019 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 16 17:58:44.461027 kernel: pnp: PnP ACPI 
init Jan 16 17:58:44.461120 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jan 16 17:58:44.461131 kernel: pnp: PnP ACPI: found 1 devices Jan 16 17:58:44.461139 kernel: NET: Registered PF_INET protocol family Jan 16 17:58:44.461148 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 16 17:58:44.461156 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Jan 16 17:58:44.461166 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 16 17:58:44.461174 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 16 17:58:44.461182 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Jan 16 17:58:44.461190 kernel: TCP: Hash tables configured (established 131072 bind 65536) Jan 16 17:58:44.461199 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 16 17:58:44.461207 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 16 17:58:44.461215 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 16 17:58:44.461303 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jan 16 17:58:44.461314 kernel: PCI: CLS 0 bytes, default 64 Jan 16 17:58:44.461323 kernel: kvm [1]: HYP mode not available Jan 16 17:58:44.461331 kernel: Initialise system trusted keyrings Jan 16 17:58:44.461339 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Jan 16 17:58:44.461347 kernel: Key type asymmetric registered Jan 16 17:58:44.461355 kernel: Asymmetric key parser 'x509' registered Jan 16 17:58:44.461364 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jan 16 17:58:44.461372 kernel: io scheduler mq-deadline registered Jan 16 17:58:44.461380 kernel: io scheduler kyber registered Jan 16 17:58:44.461388 kernel: io scheduler bfq registered Jan 16 17:58:44.461396 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jan 16 
17:58:44.461501 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Jan 16 17:58:44.461586 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Jan 16 17:58:44.461668 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.461748 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Jan 16 17:58:44.461827 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Jan 16 17:58:44.461907 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.461987 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Jan 16 17:58:44.462066 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Jan 16 17:58:44.462147 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.462227 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Jan 16 17:58:44.462315 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Jan 16 17:58:44.462394 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.462483 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Jan 16 17:58:44.462562 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Jan 16 17:58:44.462643 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.462722 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Jan 16 17:58:44.462801 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55 Jan 16 17:58:44.462881 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.462961 
kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Jan 16 17:58:44.463040 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Jan 16 17:58:44.463121 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.463204 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Jan 16 17:58:44.463283 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Jan 16 17:58:44.463366 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.463377 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jan 16 17:58:44.463467 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Jan 16 17:58:44.463547 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Jan 16 17:58:44.463627 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.463707 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Jan 16 17:58:44.463786 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Jan 16 17:58:44.463864 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.463943 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Jan 16 17:58:44.464023 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Jan 16 17:58:44.464104 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.464186 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Jan 16 17:58:44.464276 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Jan 16 17:58:44.464356 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ 
NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.464443 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Jan 16 17:58:44.464523 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Jan 16 17:58:44.464602 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.464683 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Jan 16 17:58:44.464773 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Jan 16 17:58:44.464860 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.464940 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Jan 16 17:58:44.465019 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Jan 16 17:58:44.465097 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.465180 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Jan 16 17:58:44.465259 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Jan 16 17:58:44.465337 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.465348 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 16 17:58:44.465434 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Jan 16 17:58:44.465516 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Jan 16 17:58:44.465597 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.465677 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Jan 16 17:58:44.465755 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Jan 16 17:58:44.465833 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ 
MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.465913 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Jan 16 17:58:44.465991 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Jan 16 17:58:44.466069 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.466150 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Jan 16 17:58:44.466229 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Jan 16 17:58:44.466307 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.466386 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Jan 16 17:58:44.466473 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Jan 16 17:58:44.466553 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.466638 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Jan 16 17:58:44.466717 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Jan 16 17:58:44.466796 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.466876 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Jan 16 17:58:44.466956 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Jan 16 17:58:44.467034 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.467121 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Jan 16 17:58:44.467207 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Jan 16 17:58:44.467286 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.467298 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jan 16 17:58:44.467376 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Jan 16 17:58:44.467468 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Jan 16 17:58:44.467550 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.467634 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Jan 16 17:58:44.467713 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Jan 16 17:58:44.467798 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.467879 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Jan 16 17:58:44.467960 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Jan 16 17:58:44.468039 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.468123 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Jan 16 17:58:44.468204 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Jan 16 17:58:44.468284 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.468367 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Jan 16 17:58:44.468460 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Jan 16 17:58:44.468552 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.468649 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Jan 16 17:58:44.468744 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Jan 16 17:58:44.468845 kernel: pcieport 
0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.468929 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Jan 16 17:58:44.469009 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Jan 16 17:58:44.469088 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.469169 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Jan 16 17:58:44.469250 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Jan 16 17:58:44.469336 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.469427 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Jan 16 17:58:44.469512 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Jan 16 17:58:44.469592 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:58:44.469603 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jan 16 17:58:44.469614 kernel: ACPI: button: Power Button [PWRB] Jan 16 17:58:44.469707 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Jan 16 17:58:44.469796 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jan 16 17:58:44.469807 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 16 17:58:44.469816 kernel: thunder_xcv, ver 1.0 Jan 16 17:58:44.469828 kernel: thunder_bgx, ver 1.0 Jan 16 17:58:44.469837 kernel: nicpf, ver 1.0 Jan 16 17:58:44.469847 kernel: nicvf, ver 1.0 Jan 16 17:58:44.469946 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jan 16 17:58:44.470024 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-01-16T17:58:43 UTC (1768586323) Jan 16 17:58:44.470034 kernel: hid: raw HID events driver (C) Jiri 
Kosina Jan 16 17:58:44.470043 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jan 16 17:58:44.470051 kernel: watchdog: NMI not fully supported Jan 16 17:58:44.470061 kernel: watchdog: Hard watchdog permanently disabled Jan 16 17:58:44.470069 kernel: NET: Registered PF_INET6 protocol family Jan 16 17:58:44.470077 kernel: Segment Routing with IPv6 Jan 16 17:58:44.470085 kernel: In-situ OAM (IOAM) with IPv6 Jan 16 17:58:44.470093 kernel: NET: Registered PF_PACKET protocol family Jan 16 17:58:44.470101 kernel: Key type dns_resolver registered Jan 16 17:58:44.470109 kernel: registered taskstats version 1 Jan 16 17:58:44.470117 kernel: Loading compiled-in X.509 certificates Jan 16 17:58:44.470126 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.65-flatcar: 27e3aa638f3535434dc9dbdde4239fca944d5458' Jan 16 17:58:44.470134 kernel: Demotion targets for Node 0: null Jan 16 17:58:44.470142 kernel: Key type .fscrypt registered Jan 16 17:58:44.470150 kernel: Key type fscrypt-provisioning registered Jan 16 17:58:44.470158 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 16 17:58:44.470166 kernel: ima: Allocated hash algorithm: sha1
Jan 16 17:58:44.470174 kernel: ima: No architecture policies found
Jan 16 17:58:44.470183 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Jan 16 17:58:44.470191 kernel: clk: Disabling unused clocks
Jan 16 17:58:44.470199 kernel: PM: genpd: Disabling unused power domains
Jan 16 17:58:44.470207 kernel: Freeing unused kernel memory: 12480K
Jan 16 17:58:44.470215 kernel: Run /init as init process
Jan 16 17:58:44.470223 kernel: with arguments:
Jan 16 17:58:44.470231 kernel: /init
Jan 16 17:58:44.470240 kernel: with environment:
Jan 16 17:58:44.470248 kernel: HOME=/
Jan 16 17:58:44.470256 kernel: TERM=linux
Jan 16 17:58:44.470264 kernel: ACPI: bus type USB registered
Jan 16 17:58:44.470272 kernel: usbcore: registered new interface driver usbfs
Jan 16 17:58:44.470280 kernel: usbcore: registered new interface driver hub
Jan 16 17:58:44.470288 kernel: usbcore: registered new device driver usb
Jan 16 17:58:44.470391 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Jan 16 17:58:44.470486 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Jan 16 17:58:44.470570 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Jan 16 17:58:44.470653 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Jan 16 17:58:44.470738 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Jan 16 17:58:44.470819 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Jan 16 17:58:44.470925 kernel: hub 1-0:1.0: USB hub found
Jan 16 17:58:44.471028 kernel: hub 1-0:1.0: 4 ports detected
Jan 16 17:58:44.471138 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Jan 16 17:58:44.471237 kernel: hub 2-0:1.0: USB hub found
Jan 16 17:58:44.471331 kernel: hub 2-0:1.0: 4 ports detected
Jan 16 17:58:44.471444 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Jan 16 17:58:44.471538 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB)
Jan 16 17:58:44.471549 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Jan 16 17:58:44.471558 kernel: GPT:25804799 != 104857599
Jan 16 17:58:44.471566 kernel: GPT:Alternate GPT header not at the end of the disk.
Jan 16 17:58:44.471574 kernel: GPT:25804799 != 104857599
Jan 16 17:58:44.471582 kernel: GPT: Use GNU Parted to correct GPT errors.
Jan 16 17:58:44.471592 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Jan 16 17:58:44.471600 kernel: SCSI subsystem initialized
Jan 16 17:58:44.471609 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 16 17:58:44.471617 kernel: device-mapper: uevent: version 1.0.3
Jan 16 17:58:44.471626 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Jan 16 17:58:44.471634 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Jan 16 17:58:44.471644 kernel: raid6: neonx8 gen() 15829 MB/s
Jan 16 17:58:44.471652 kernel: raid6: neonx4 gen() 15738 MB/s
Jan 16 17:58:44.471660 kernel: raid6: neonx2 gen() 13207 MB/s
Jan 16 17:58:44.471669 kernel: raid6: neonx1 gen() 10395 MB/s
Jan 16 17:58:44.471677 kernel: raid6: int64x8 gen() 6832 MB/s
Jan 16 17:58:44.471685 kernel: raid6: int64x4 gen() 7359 MB/s
Jan 16 17:58:44.471693 kernel: raid6: int64x2 gen() 6123 MB/s
Jan 16 17:58:44.471702 kernel: raid6: int64x1 gen() 5046 MB/s
Jan 16 17:58:44.471711 kernel: raid6: using algorithm neonx8 gen() 15829 MB/s
Jan 16 17:58:44.471719 kernel: raid6: .... xor() 12041 MB/s, rmw enabled
Jan 16 17:58:44.471728 kernel: raid6: using neon recovery algorithm
Jan 16 17:58:44.471736 kernel: xor: measuring software checksum speed
Jan 16 17:58:44.471746 kernel: 8regs : 21596 MB/sec
Jan 16 17:58:44.471754 kernel: 32regs : 20222 MB/sec
Jan 16 17:58:44.471763 kernel: arm64_neon : 28061 MB/sec
Jan 16 17:58:44.471772 kernel: xor: using function: arm64_neon (28061 MB/sec)
Jan 16 17:58:44.471878 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Jan 16 17:58:44.471890 kernel: Btrfs loaded, zoned=no, fsverity=no
Jan 16 17:58:44.471899 kernel: BTRFS: device fsid 772c9e2d-7e98-4acf-842c-b5416fff0f38 devid 1 transid 34 /dev/mapper/usr (253:0) scanned by mount (273)
Jan 16 17:58:44.471908 kernel: BTRFS info (device dm-0): first mount of filesystem 772c9e2d-7e98-4acf-842c-b5416fff0f38
Jan 16 17:58:44.471916 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Jan 16 17:58:44.471927 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Jan 16 17:58:44.471935 kernel: BTRFS info (device dm-0): enabling free space tree
Jan 16 17:58:44.471943 kernel: loop: module loaded
Jan 16 17:58:44.471952 kernel: loop0: detected capacity change from 0 to 91832
Jan 16 17:58:44.471960 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Jan 16 17:58:44.472062 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Jan 16 17:58:44.472076 systemd[1]: Successfully made /usr/ read-only.
Jan 16 17:58:44.472087 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jan 16 17:58:44.472097 systemd[1]: Detected virtualization kvm.
Jan 16 17:58:44.472105 systemd[1]: Detected architecture arm64.
Jan 16 17:58:44.472114 systemd[1]: Running in initrd.
Jan 16 17:58:44.472122 systemd[1]: No hostname configured, using default hostname.
Jan 16 17:58:44.472132 systemd[1]: Hostname set to .
Jan 16 17:58:44.472141 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID.
Jan 16 17:58:44.472150 systemd[1]: Queued start job for default target initrd.target.
Jan 16 17:58:44.472159 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr.
Jan 16 17:58:44.472168 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 16 17:58:44.472176 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 16 17:58:44.472187 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Jan 16 17:58:44.472196 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 16 17:58:44.472206 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Jan 16 17:58:44.472215 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Jan 16 17:58:44.472224 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 16 17:58:44.472232 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 16 17:58:44.472242 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Jan 16 17:58:44.472251 systemd[1]: Reached target paths.target - Path Units.
Jan 16 17:58:44.472260 systemd[1]: Reached target slices.target - Slice Units.
Jan 16 17:58:44.472268 systemd[1]: Reached target swap.target - Swaps.
Jan 16 17:58:44.472277 systemd[1]: Reached target timers.target - Timer Units.
Jan 16 17:58:44.472286 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Jan 16 17:58:44.472294 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 16 17:58:44.472304 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket.
Jan 16 17:58:44.472313 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Jan 16 17:58:44.472322 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Jan 16 17:58:44.472331 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 16 17:58:44.472339 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 16 17:58:44.472348 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 16 17:58:44.472358 systemd[1]: Reached target sockets.target - Socket Units.
Jan 16 17:58:44.472367 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Jan 16 17:58:44.472376 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Jan 16 17:58:44.472385 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 16 17:58:44.472393 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Jan 16 17:58:44.472402 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Jan 16 17:58:44.472411 systemd[1]: Starting systemd-fsck-usr.service...
Jan 16 17:58:44.472431 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 16 17:58:44.472441 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 16 17:58:44.472451 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 16 17:58:44.472460 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Jan 16 17:58:44.472471 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 16 17:58:44.472480 systemd[1]: Finished systemd-fsck-usr.service.
Jan 16 17:58:44.472489 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 16 17:58:44.472518 systemd-journald[415]: Collecting audit messages is enabled.
Jan 16 17:58:44.472541 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 16 17:58:44.472549 kernel: Bridge firewalling registered
Jan 16 17:58:44.472558 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 16 17:58:44.472567 kernel: audit: type=1130 audit(1768586324.413:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.472576 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 16 17:58:44.472587 kernel: audit: type=1130 audit(1768586324.417:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.472596 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 16 17:58:44.472605 kernel: audit: type=1130 audit(1768586324.423:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.472614 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 16 17:58:44.472623 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 16 17:58:44.472632 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 16 17:58:44.472642 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 16 17:58:44.472651 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 16 17:58:44.472660 kernel: audit: type=1130 audit(1768586324.445:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.472670 kernel: audit: type=1334 audit(1768586324.445:6): prog-id=6 op=LOAD
Jan 16 17:58:44.472678 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 16 17:58:44.472688 kernel: audit: type=1130 audit(1768586324.457:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.472697 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 16 17:58:44.472707 kernel: audit: type=1130 audit(1768586324.463:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.472716 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jan 16 17:58:44.472725 systemd-journald[415]: Journal started
Jan 16 17:58:44.472744 systemd-journald[415]: Runtime Journal (/run/log/journal/3c7eb63aaaff442f99706db93d911aec) is 8M, max 319.5M, 311.5M free.
Jan 16 17:58:44.413000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.417000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.423000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.445000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.445000 audit: BPF prog-id=6 op=LOAD
Jan 16 17:58:44.457000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.463000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.410748 systemd-modules-load[419]: Inserted module 'br_netfilter'
Jan 16 17:58:44.480527 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 16 17:58:44.480564 kernel: audit: type=1130 audit(1768586324.479:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.479000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.484251 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 16 17:58:44.488639 dracut-cmdline[444]: dracut-109
Jan 16 17:58:44.491468 dracut-cmdline[444]: Using kernel command line parameters: SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=924fb3eb04ba1d8edcb66284d30e3342855b0579b62556e7722bcf37e82bda13
Jan 16 17:58:44.501814 systemd-tmpfiles[457]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Jan 16 17:58:44.501852 systemd-resolved[435]: Positive Trust Anchors:
Jan 16 17:58:44.501862 systemd-resolved[435]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 16 17:58:44.501865 systemd-resolved[435]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16
Jan 16 17:58:44.509000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.501895 systemd-resolved[435]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 16 17:58:44.519810 kernel: audit: type=1130 audit(1768586324.509:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.507664 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 16 17:58:44.529160 systemd-resolved[435]: Defaulting to hostname 'linux'.
Jan 16 17:58:44.530071 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 16 17:58:44.531000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.532221 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 16 17:58:44.571446 kernel: Loading iSCSI transport class v2.0-870.
Jan 16 17:58:44.583460 kernel: iscsi: registered transport (tcp)
Jan 16 17:58:44.598460 kernel: iscsi: registered transport (qla4xxx)
Jan 16 17:58:44.598487 kernel: QLogic iSCSI HBA Driver
Jan 16 17:58:44.620181 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jan 16 17:58:44.635684 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jan 16 17:58:44.636000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.637173 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jan 16 17:58:44.681396 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jan 16 17:58:44.681000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.683663 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jan 16 17:58:44.685188 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Jan 16 17:58:44.713743 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jan 16 17:58:44.714000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.715000 audit: BPF prog-id=7 op=LOAD
Jan 16 17:58:44.715000 audit: BPF prog-id=8 op=LOAD
Jan 16 17:58:44.716116 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 16 17:58:44.748124 systemd-udevd[686]: Using default interface naming scheme 'v257'.
Jan 16 17:58:44.755760 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 16 17:58:44.756000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.758108 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jan 16 17:58:44.781935 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 16 17:58:44.782000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.785120 dracut-pre-trigger[764]: rd.md=0: removing MD RAID activation
Jan 16 17:58:44.784000 audit: BPF prog-id=9 op=LOAD
Jan 16 17:58:44.786725 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 16 17:58:44.807505 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 16 17:58:44.809000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.811038 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 16 17:58:44.831923 systemd-networkd[800]: lo: Link UP
Jan 16 17:58:44.831931 systemd-networkd[800]: lo: Gained carrier
Jan 16 17:58:44.833000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.832705 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 16 17:58:44.834020 systemd[1]: Reached target network.target - Network.
Jan 16 17:58:44.894711 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 16 17:58:44.895000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:44.899135 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jan 16 17:58:44.964951 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Jan 16 17:58:44.973628 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Jan 16 17:58:44.982260 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Jan 16 17:58:44.988500 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Jan 16 17:58:44.988524 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Jan 16 17:58:44.989667 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Jan 16 17:58:44.993054 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Jan 16 17:58:44.992650 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Jan 16 17:58:45.016992 disk-uuid[864]: Primary Header is updated.
Jan 16 17:58:45.016992 disk-uuid[864]: Secondary Entries is updated.
Jan 16 17:58:45.016992 disk-uuid[864]: Secondary Header is updated.
Jan 16 17:58:45.022312 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 16 17:58:45.022458 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 16 17:58:45.024000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:45.024687 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jan 16 17:58:45.025585 systemd-networkd[800]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Jan 16 17:58:45.025588 systemd-networkd[800]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jan 16 17:58:45.026090 systemd-networkd[800]: eth0: Link UP
Jan 16 17:58:45.027243 systemd-networkd[800]: eth0: Gained carrier
Jan 16 17:58:45.027255 systemd-networkd[800]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Jan 16 17:58:45.028183 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 16 17:58:45.048456 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Jan 16 17:58:45.048668 kernel: usbcore: registered new interface driver usbhid
Jan 16 17:58:45.048681 kernel: usbhid: USB HID core driver
Jan 16 17:58:45.060159 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 16 17:58:45.061000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:45.083511 systemd-networkd[800]: eth0: DHCPv4 address 10.0.7.62/25, gateway 10.0.7.1 acquired from 10.0.7.1
Jan 16 17:58:45.110501 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jan 16 17:58:45.111000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:45.111942 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 16 17:58:45.113646 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 16 17:58:45.115470 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 16 17:58:45.118316 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jan 16 17:58:45.150264 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jan 16 17:58:45.150000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:46.054211 disk-uuid[868]: Warning: The kernel is still using the old partition table.
Jan 16 17:58:46.054211 disk-uuid[868]: The new table will be used at the next reboot or after you
Jan 16 17:58:46.054211 disk-uuid[868]: run partprobe(8) or kpartx(8)
Jan 16 17:58:46.054211 disk-uuid[868]: The operation has completed successfully.
Jan 16 17:58:46.059279 systemd[1]: disk-uuid.service: Deactivated successfully.
Jan 16 17:58:46.059382 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Jan 16 17:58:46.060000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:46.062000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:46.063387 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Jan 16 17:58:46.106439 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (899)
Jan 16 17:58:46.108900 kernel: BTRFS info (device vda6): first mount of filesystem 5e96ab2e-f088-4ca2-ba97-55451a1893dc
Jan 16 17:58:46.108958 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Jan 16 17:58:46.113878 kernel: BTRFS info (device vda6): turning on async discard
Jan 16 17:58:46.113963 kernel: BTRFS info (device vda6): enabling free space tree
Jan 16 17:58:46.119436 kernel: BTRFS info (device vda6): last unmount of filesystem 5e96ab2e-f088-4ca2-ba97-55451a1893dc
Jan 16 17:58:46.120059 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Jan 16 17:58:46.121000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:46.122976 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Jan 16 17:58:46.259113 ignition[918]: Ignition 2.24.0
Jan 16 17:58:46.260027 ignition[918]: Stage: fetch-offline
Jan 16 17:58:46.260070 ignition[918]: no configs at "/usr/lib/ignition/base.d"
Jan 16 17:58:46.260087 ignition[918]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 16 17:58:46.260248 ignition[918]: parsed url from cmdline: ""
Jan 16 17:58:46.260251 ignition[918]: no config URL provided
Jan 16 17:58:46.260904 ignition[918]: reading system config file "/usr/lib/ignition/user.ign"
Jan 16 17:58:46.260917 ignition[918]: no config at "/usr/lib/ignition/user.ign"
Jan 16 17:58:46.260922 ignition[918]: failed to fetch config: resource requires networking
Jan 16 17:58:46.266775 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 16 17:58:46.263740 ignition[918]: Ignition finished successfully
Jan 16 17:58:46.269000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:46.271129 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Jan 16 17:58:46.297146 ignition[932]: Ignition 2.24.0
Jan 16 17:58:46.297166 ignition[932]: Stage: fetch
Jan 16 17:58:46.297303 ignition[932]: no configs at "/usr/lib/ignition/base.d"
Jan 16 17:58:46.297311 ignition[932]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 16 17:58:46.297385 ignition[932]: parsed url from cmdline: ""
Jan 16 17:58:46.297389 ignition[932]: no config URL provided
Jan 16 17:58:46.297393 ignition[932]: reading system config file "/usr/lib/ignition/user.ign"
Jan 16 17:58:46.297398 ignition[932]: no config at "/usr/lib/ignition/user.ign"
Jan 16 17:58:46.297572 ignition[932]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Jan 16 17:58:46.297588 ignition[932]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Jan 16 17:58:46.297594 ignition[932]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Jan 16 17:58:46.769624 systemd-networkd[800]: eth0: Gained IPv6LL
Jan 16 17:58:47.298109 ignition[932]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Jan 16 17:58:47.298159 ignition[932]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Jan 16 17:58:48.298728 ignition[932]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Jan 16 17:58:48.298792 ignition[932]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Jan 16 17:58:48.842355 ignition[932]: GET result: OK
Jan 16 17:58:48.843593 ignition[932]: parsing config with SHA512: fb63cd8cd373e5fcb7897d2f884d0cf822ec90680362ba300fdb4840c0b8261dd441b4488aa383debbcc78572cbe0a13f8d828cb94ae87edcabdd59f74139bb8
Jan 16 17:58:48.848024 unknown[932]: fetched base config from "system"
Jan 16 17:58:48.848341 ignition[932]: fetch: fetch complete
Jan 16 17:58:48.848033 unknown[932]: fetched base config from "system"
Jan 16 17:58:48.854472 kernel: kauditd_printk_skb: 20 callbacks suppressed
Jan 16 17:58:48.854495 kernel: audit: type=1130 audit(1768586328.850:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:48.850000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:48.848346 ignition[932]: fetch: fetch passed
Jan 16 17:58:48.848038 unknown[932]: fetched user config from "openstack"
Jan 16 17:58:48.848383 ignition[932]: Ignition finished successfully
Jan 16 17:58:48.849862 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Jan 16 17:58:48.854544 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Jan 16 17:58:48.886080 ignition[939]: Ignition 2.24.0
Jan 16 17:58:48.886100 ignition[939]: Stage: kargs
Jan 16 17:58:48.886241 ignition[939]: no configs at "/usr/lib/ignition/base.d"
Jan 16 17:58:48.886249 ignition[939]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 16 17:58:48.886975 ignition[939]: kargs: kargs passed
Jan 16 17:58:48.889913 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Jan 16 17:58:48.890000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:48.887018 ignition[939]: Ignition finished successfully
Jan 16 17:58:48.895295 kernel: audit: type=1130 audit(1768586328.890:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:48.891827 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Jan 16 17:58:48.918091 ignition[946]: Ignition 2.24.0
Jan 16 17:58:48.918111 ignition[946]: Stage: disks
Jan 16 17:58:48.918253 ignition[946]: no configs at "/usr/lib/ignition/base.d"
Jan 16 17:58:48.918261 ignition[946]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 16 17:58:48.919003 ignition[946]: disks: disks passed
Jan 16 17:58:48.922000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:48.921672 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Jan 16 17:58:48.928236 kernel: audit: type=1130 audit(1768586328.922:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:48.919044 ignition[946]: Ignition finished successfully
Jan 16 17:58:48.923513 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Jan 16 17:58:48.927456 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Jan 16 17:58:48.929538 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 16 17:58:48.930993 systemd[1]: Reached target sysinit.target - System Initialization.
Jan 16 17:58:48.932726 systemd[1]: Reached target basic.target - Basic System.
Jan 16 17:58:48.935260 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Jan 16 17:58:48.974649 systemd-fsck[955]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks
Jan 16 17:58:48.978458 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Jan 16 17:58:48.979000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:48.980468 systemd[1]: Mounting sysroot.mount - /sysroot...
Jan 16 17:58:48.985279 kernel: audit: type=1130 audit(1768586328.979:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:49.092474 kernel: EXT4-fs (vda9): mounted filesystem 3360ad79-d1e3-4f32-ae7d-4a8c0a3c719d r/w with ordered data mode. Quota mode: none.
Jan 16 17:58:49.093150 systemd[1]: Mounted sysroot.mount - /sysroot.
Jan 16 17:58:49.094391 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Jan 16 17:58:49.097590 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 16 17:58:49.099402 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Jan 16 17:58:49.100322 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Jan 16 17:58:49.100971 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Jan 16 17:58:49.103458 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Jan 16 17:58:49.103487 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 16 17:58:49.110312 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Jan 16 17:58:49.112478 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Jan 16 17:58:49.127471 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (963)
Jan 16 17:58:49.130443 kernel: BTRFS info (device vda6): first mount of filesystem 5e96ab2e-f088-4ca2-ba97-55451a1893dc
Jan 16 17:58:49.130486 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Jan 16 17:58:49.142349 kernel: BTRFS info (device vda6): turning on async discard
Jan 16 17:58:49.142407 kernel: BTRFS info (device vda6): enabling free space tree
Jan 16 17:58:49.143651 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 16 17:58:49.167459 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 16 17:58:49.262666 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jan 16 17:58:49.263000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:49.264906 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jan 16 17:58:49.268280 kernel: audit: type=1130 audit(1768586329.263:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:49.268237 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jan 16 17:58:49.285575 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jan 16 17:58:49.287773 kernel: BTRFS info (device vda6): last unmount of filesystem 5e96ab2e-f088-4ca2-ba97-55451a1893dc
Jan 16 17:58:49.303402 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jan 16 17:58:49.303000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:49.308480 kernel: audit: type=1130 audit(1768586329.303:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:49.313551 ignition[1063]: INFO : Ignition 2.24.0
Jan 16 17:58:49.313551 ignition[1063]: INFO : Stage: mount
Jan 16 17:58:49.315039 ignition[1063]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 16 17:58:49.315039 ignition[1063]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 16 17:58:49.315039 ignition[1063]: INFO : mount: mount passed
Jan 16 17:58:49.315039 ignition[1063]: INFO : Ignition finished successfully
Jan 16 17:58:49.321906 kernel: audit: type=1130 audit(1768586329.317:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:49.317000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:49.316719 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jan 16 17:58:50.195463 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 16 17:58:52.205531 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 16 17:58:56.210497 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Jan 16 17:58:56.218062 coreos-metadata[965]: Jan 16 17:58:56.217 WARN failed to locate config-drive, using the metadata service API instead
Jan 16 17:58:56.237195 coreos-metadata[965]: Jan 16 17:58:56.237 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Jan 16 17:58:57.522178 coreos-metadata[965]: Jan 16 17:58:57.522 INFO Fetch successful
Jan 16 17:58:57.523934 coreos-metadata[965]: Jan 16 17:58:57.523 INFO wrote hostname ci-4580-0-0-p-7f6b5ebc40 to /sysroot/etc/hostname
Jan 16 17:58:57.525077 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Jan 16 17:58:57.531769 kernel: audit: type=1130 audit(1768586337.526:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:57.531793 kernel: audit: type=1131 audit(1768586337.526:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:57.526000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:57.526000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:57.525168 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Jan 16 17:58:57.527418 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jan 16 17:58:57.549573 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 16 17:58:57.579451 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1082)
Jan 16 17:58:57.582136 kernel: BTRFS info (device vda6): first mount of filesystem 5e96ab2e-f088-4ca2-ba97-55451a1893dc
Jan 16 17:58:57.582165 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Jan 16 17:58:57.586441 kernel: BTRFS info (device vda6): turning on async discard
Jan 16 17:58:57.586481 kernel: BTRFS info (device vda6): enabling free space tree
Jan 16 17:58:57.587644 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 16 17:58:57.612829 ignition[1100]: INFO : Ignition 2.24.0
Jan 16 17:58:57.612829 ignition[1100]: INFO : Stage: files
Jan 16 17:58:57.614300 ignition[1100]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 16 17:58:57.614300 ignition[1100]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 16 17:58:57.614300 ignition[1100]: DEBUG : files: compiled without relabeling support, skipping
Jan 16 17:58:57.617655 ignition[1100]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jan 16 17:58:57.617655 ignition[1100]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jan 16 17:58:57.624320 ignition[1100]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jan 16 17:58:57.625911 ignition[1100]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jan 16 17:58:57.625911 ignition[1100]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jan 16 17:58:57.624994 unknown[1100]: wrote ssh authorized keys file for user: core
Jan 16 17:58:57.629376 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Jan 16 17:58:57.629376 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1
Jan 16 17:58:57.707762 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jan 16 17:58:57.818934 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Jan 16 17:58:57.818934 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jan 16 17:58:57.822571 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jan 16 17:58:57.822571 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jan 16 17:58:57.822571 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jan 16 17:58:57.822571 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 16 17:58:57.822571 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 16 17:58:57.822571 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 16 17:58:57.822571 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 16 17:58:57.833401 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jan 16 17:58:57.833401 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jan 16 17:58:57.833401 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Jan 16 17:58:57.833401 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Jan 16 17:58:57.833401 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Jan 16 17:58:57.833401 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1
Jan 16 17:58:58.217609 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jan 16 17:58:59.828057 ignition[1100]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Jan 16 17:58:59.828057 ignition[1100]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Jan 16 17:58:59.831876 ignition[1100]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 16 17:58:59.835154 ignition[1100]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 16 17:58:59.835154 ignition[1100]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Jan 16 17:58:59.835154 ignition[1100]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Jan 16 17:58:59.835154 ignition[1100]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Jan 16 17:58:59.835154 ignition[1100]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Jan 16 17:58:59.835154 ignition[1100]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jan 16 17:58:59.835154 ignition[1100]: INFO : files: files passed
Jan 16 17:58:59.835154 ignition[1100]: INFO : Ignition finished successfully
Jan 16 17:58:59.849240 kernel: audit: type=1130 audit(1768586339.838:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.838000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.837484 systemd[1]: Finished ignition-files.service - Ignition (files).
Jan 16 17:58:59.840256 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jan 16 17:58:59.853389 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jan 16 17:58:59.855827 systemd[1]: ignition-quench.service: Deactivated successfully.
Jan 16 17:58:59.855928 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jan 16 17:58:59.862742 kernel: audit: type=1130 audit(1768586339.857:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.862768 kernel: audit: type=1131 audit(1768586339.857:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.857000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.857000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.865638 initrd-setup-root-after-ignition[1133]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 16 17:58:59.865638 initrd-setup-root-after-ignition[1133]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jan 16 17:58:59.868522 initrd-setup-root-after-ignition[1137]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 16 17:58:59.869000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.868064 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 16 17:58:59.875157 kernel: audit: type=1130 audit(1768586339.869:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.869797 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jan 16 17:58:59.874933 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jan 16 17:58:59.910822 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 16 17:58:59.910933 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jan 16 17:58:59.918659 kernel: audit: type=1130 audit(1768586339.912:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.918681 kernel: audit: type=1131 audit(1768586339.912:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.912000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.912000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.913185 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jan 16 17:58:59.919510 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jan 16 17:58:59.921290 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jan 16 17:58:59.922107 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jan 16 17:58:59.953784 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 16 17:58:59.954000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.956068 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jan 16 17:58:59.960102 kernel: audit: type=1130 audit(1768586339.954:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.977285 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr.
Jan 16 17:58:59.977409 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jan 16 17:58:59.979927 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 16 17:58:59.981799 systemd[1]: Stopped target timers.target - Timer Units.
Jan 16 17:58:59.983368 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 16 17:58:59.984000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.983501 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 16 17:58:59.988963 kernel: audit: type=1131 audit(1768586339.984:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:58:59.988103 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jan 16 17:58:59.989899 systemd[1]: Stopped target basic.target - Basic System.
Jan 16 17:58:59.991318 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jan 16 17:58:59.992950 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 16 17:58:59.994691 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jan 16 17:58:59.996509 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Jan 16 17:58:59.998400 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jan 16 17:59:00.000166 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 16 17:59:00.001982 systemd[1]: Stopped target sysinit.target - System Initialization.
Jan 16 17:59:00.003766 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jan 16 17:59:00.005401 systemd[1]: Stopped target swap.target - Swaps.
Jan 16 17:59:00.006844 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 16 17:59:00.008000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.006961 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jan 16 17:59:00.009162 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jan 16 17:59:00.011010 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 16 17:59:00.012737 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jan 16 17:59:00.012813 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 16 17:59:00.016000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.014593 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 16 17:59:00.014697 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jan 16 17:59:00.019000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.017353 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jan 16 17:59:00.020000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.017494 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 16 17:59:00.019348 systemd[1]: ignition-files.service: Deactivated successfully.
Jan 16 17:59:00.019461 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jan 16 17:59:00.026000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.021817 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jan 16 17:59:00.027000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.023340 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jan 16 17:59:00.029000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.024324 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 16 17:59:00.024480 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 16 17:59:00.026352 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 16 17:59:00.026467 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 16 17:59:00.028143 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 16 17:59:00.028239 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 16 17:59:00.035489 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 16 17:59:00.037448 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jan 16 17:59:00.038000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.038000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.042750 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jan 16 17:59:00.044634 ignition[1157]: INFO : Ignition 2.24.0
Jan 16 17:59:00.044634 ignition[1157]: INFO : Stage: umount
Jan 16 17:59:00.046150 ignition[1157]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 16 17:59:00.046150 ignition[1157]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Jan 16 17:59:00.046150 ignition[1157]: INFO : umount: umount passed
Jan 16 17:59:00.046150 ignition[1157]: INFO : Ignition finished successfully
Jan 16 17:59:00.047883 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 16 17:59:00.050000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.048933 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 16 17:59:00.051000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.050776 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 16 17:59:00.054000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.050860 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 16 17:59:00.056000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.052571 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 16 17:59:00.057000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.052655 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 16 17:59:00.054616 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 16 17:59:00.061000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.054662 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 16 17:59:00.056896 systemd[1]: ignition-fetch.service: Deactivated successfully.
Jan 16 17:59:00.056942 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Jan 16 17:59:00.058344 systemd[1]: Stopped target network.target - Network.
Jan 16 17:59:00.059806 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 16 17:59:00.059857 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 16 17:59:00.061508 systemd[1]: Stopped target paths.target - Path Units.
Jan 16 17:59:00.062974 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 16 17:59:00.066564 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 16 17:59:00.068144 systemd[1]: Stopped target slices.target - Slice Units.
Jan 16 17:59:00.069820 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 16 17:59:00.071285 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 16 17:59:00.077000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.071325 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 16 17:59:00.079000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.072970 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 16 17:59:00.080000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.073003 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 16 17:59:00.074995 systemd[1]: systemd-journald-audit.socket: Deactivated successfully.
Jan 16 17:59:00.075018 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket.
Jan 16 17:59:00.076498 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 16 17:59:00.076549 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 16 17:59:00.078105 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 16 17:59:00.078147 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 16 17:59:00.079564 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 16 17:59:00.079612 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 16 17:59:00.081264 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 16 17:59:00.082731 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 16 17:59:00.095977 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 16 17:59:00.097000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.096078 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 16 17:59:00.099819 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 16 17:59:00.099937 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 16 17:59:00.101000 audit: BPF prog-id=6 op=UNLOAD
Jan 16 17:59:00.103000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.105961 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Jan 16 17:59:00.107137 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 16 17:59:00.107186 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 16 17:59:00.109762 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 16 17:59:00.111364 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 16 17:59:00.113000 audit: BPF prog-id=9 op=UNLOAD
Jan 16 17:59:00.113000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.111438 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 16 17:59:00.115000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.113488 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 16 17:59:00.117000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.113530 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 16 17:59:00.115217 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 16 17:59:00.115258 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 16 17:59:00.117349 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 16 17:59:00.144006 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 16 17:59:00.144169 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 16 17:59:00.145000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.146349 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 16 17:59:00.146385 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 16 17:59:00.148125 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 16 17:59:00.150000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.148154 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 16 17:59:00.149809 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 16 17:59:00.154000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.149858 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 16 17:59:00.152284 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 16 17:59:00.156000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.152335 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 16 17:59:00.154972 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 16 17:59:00.155023 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 16 17:59:00.166347 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 16 17:59:00.167449 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Jan 16 17:59:00.167531 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Jan 16 17:59:00.169000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.170367 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 16 17:59:00.172000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.170415 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 16 17:59:00.174000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.172600 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Jan 16 17:59:00.176000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.172640 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 16 17:59:00.178000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.174546 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 16 17:59:00.174589 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 16 17:59:00.176548 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 16 17:59:00.182000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.176592 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 16 17:59:00.184000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.184000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:00.179042 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 16 17:59:00.179151 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 16 17:59:00.182638 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 16 17:59:00.182707 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 16 17:59:00.184999 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 16 17:59:00.188756 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 16 17:59:00.204810 systemd[1]: Switching root.
Jan 16 17:59:00.233338 systemd-journald[415]: Journal stopped
Jan 16 17:59:01.089088 systemd-journald[415]: Received SIGTERM from PID 1 (systemd).
Jan 16 17:59:01.089165 kernel: SELinux: policy capability network_peer_controls=1
Jan 16 17:59:01.089181 kernel: SELinux: policy capability open_perms=1
Jan 16 17:59:01.089193 kernel: SELinux: policy capability extended_socket_class=1
Jan 16 17:59:01.089208 kernel: SELinux: policy capability always_check_network=0
Jan 16 17:59:01.089220 kernel: SELinux: policy capability cgroup_seclabel=1
Jan 16 17:59:01.089231 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jan 16 17:59:01.089245 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jan 16 17:59:01.089256 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jan 16 17:59:01.089266 kernel: SELinux: policy capability userspace_initial_context=0
Jan 16 17:59:01.089276 systemd[1]: Successfully loaded SELinux policy in 63.363ms.
Jan 16 17:59:01.089299 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.661ms.
Jan 16 17:59:01.089311 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jan 16 17:59:01.089325 systemd[1]: Detected virtualization kvm.
Jan 16 17:59:01.089336 systemd[1]: Detected architecture arm64.
Jan 16 17:59:01.089346 systemd[1]: Detected first boot.
Jan 16 17:59:01.089357 systemd[1]: Hostname set to .
Jan 16 17:59:01.089369 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID.
Jan 16 17:59:01.089379 zram_generator::config[1203]: No configuration found.
Jan 16 17:59:01.089393 kernel: NET: Registered PF_VSOCK protocol family
Jan 16 17:59:01.089405 systemd[1]: Populated /etc with preset unit settings.
Jan 16 17:59:01.089417 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 16 17:59:01.090066 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jan 16 17:59:01.090086 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 16 17:59:01.090101 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jan 16 17:59:01.090113 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jan 16 17:59:01.090124 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jan 16 17:59:01.090136 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jan 16 17:59:01.090148 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jan 16 17:59:01.090159 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jan 16 17:59:01.090174 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jan 16 17:59:01.090187 systemd[1]: Created slice user.slice - User and Session Slice.
Jan 16 17:59:01.090198 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 16 17:59:01.090210 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 16 17:59:01.090221 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jan 16 17:59:01.090232 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jan 16 17:59:01.090243 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jan 16 17:59:01.090255 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 16 17:59:01.090267 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Jan 16 17:59:01.090281 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 16 17:59:01.090293 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 16 17:59:01.090304 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jan 16 17:59:01.090315 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jan 16 17:59:01.090327 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jan 16 17:59:01.090338 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jan 16 17:59:01.090350 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 16 17:59:01.090361 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 16 17:59:01.090372 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes.
Jan 16 17:59:01.090383 systemd[1]: Reached target slices.target - Slice Units.
Jan 16 17:59:01.090394 systemd[1]: Reached target swap.target - Swaps.
Jan 16 17:59:01.090442 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jan 16 17:59:01.090455 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jan 16 17:59:01.090466 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Jan 16 17:59:01.090481 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket.
Jan 16 17:59:01.090491 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket.
Jan 16 17:59:01.090502 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 16 17:59:01.090514 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket.
Jan 16 17:59:01.090528 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket.
Jan 16 17:59:01.090542 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 16 17:59:01.090553 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 16 17:59:01.090565 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jan 16 17:59:01.090576 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jan 16 17:59:01.090587 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jan 16 17:59:01.090598 systemd[1]: Mounting media.mount - External Media Directory...
Jan 16 17:59:01.090611 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jan 16 17:59:01.090622 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jan 16 17:59:01.090633 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jan 16 17:59:01.090645 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 16 17:59:01.090656 systemd[1]: Reached target machines.target - Containers.
Jan 16 17:59:01.090667 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jan 16 17:59:01.090677 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jan 16 17:59:01.090690 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 16 17:59:01.090701 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jan 16 17:59:01.090711 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 16 17:59:01.090722 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 16 17:59:01.090734 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 16 17:59:01.090744 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jan 16 17:59:01.090755 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 16 17:59:01.090770 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jan 16 17:59:01.090780 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 16 17:59:01.090793 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jan 16 17:59:01.090805 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jan 16 17:59:01.090818 systemd[1]: Stopped systemd-fsck-usr.service.
Jan 16 17:59:01.090829 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jan 16 17:59:01.090840 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 16 17:59:01.090851 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 16 17:59:01.090862 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jan 16 17:59:01.090873 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jan 16 17:59:01.090885 kernel: ACPI: bus type drm_connector registered
Jan 16 17:59:01.090896 kernel: fuse: init (API version 7.41)
Jan 16 17:59:01.090906 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Jan 16 17:59:01.090917 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 16 17:59:01.090951 systemd-journald[1270]: Collecting audit messages is enabled.
Jan 16 17:59:01.090983 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jan 16 17:59:01.090996 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jan 16 17:59:01.091007 systemd-journald[1270]: Journal started
Jan 16 17:59:01.091028 systemd-journald[1270]: Runtime Journal (/run/log/journal/3c7eb63aaaff442f99706db93d911aec) is 8M, max 319.5M, 311.5M free.
Jan 16 17:59:01.040000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.042000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.045000 audit: BPF prog-id=14 op=UNLOAD
Jan 16 17:59:01.045000 audit: BPF prog-id=13 op=UNLOAD
Jan 16 17:59:01.046000 audit: BPF prog-id=15 op=LOAD
Jan 16 17:59:01.046000 audit: BPF prog-id=16 op=LOAD
Jan 16 17:59:01.046000 audit: BPF prog-id=17 op=LOAD
Jan 16 17:59:01.086000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1
Jan 16 17:59:01.086000 audit[1270]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=4 a1=ffffccd71190 a2=4000 a3=0 items=0 ppid=1 pid=1270 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 17:59:01.086000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald"
Jan 16 17:59:00.852401 systemd[1]: Queued start job for default target multi-user.target.
Jan 16 17:59:00.872730 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Jan 16 17:59:00.873141 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 16 17:59:01.094437 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 16 17:59:01.094000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.095329 systemd[1]: Mounted media.mount - External Media Directory.
Jan 16 17:59:01.096548 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jan 16 17:59:01.097924 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jan 16 17:59:01.099162 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jan 16 17:59:01.101494 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 16 17:59:01.102000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.102974 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 16 17:59:01.103124 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jan 16 17:59:01.103000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.103000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.104733 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 16 17:59:01.104895 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 16 17:59:01.105000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.105000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.106237 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 16 17:59:01.106401 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 16 17:59:01.106000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.106000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.107825 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jan 16 17:59:01.108000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.109149 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 16 17:59:01.109303 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 16 17:59:01.110000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.110000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.110866 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 16 17:59:01.111026 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jan 16 17:59:01.111000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.111000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.112283 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 16 17:59:01.112594 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 16 17:59:01.113000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.113000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.113900 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 16 17:59:01.114000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.115363 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jan 16 17:59:01.116000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.117453 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jan 16 17:59:01.118000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.119004 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Jan 16 17:59:01.119000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.130147 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jan 16 17:59:01.131989 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket.
Jan 16 17:59:01.133183 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jan 16 17:59:01.133216 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 16 17:59:01.135120 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Jan 16 17:59:01.136462 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 16 17:59:01.136568 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met.
Jan 16 17:59:01.138339 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jan 16 17:59:01.140287 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jan 16 17:59:01.141571 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 16 17:59:01.142390 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jan 16 17:59:01.143487 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 16 17:59:01.151600 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 16 17:59:01.153641 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jan 16 17:59:01.157521 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 16 17:59:01.158535 systemd-journald[1270]: Time spent on flushing to /var/log/journal/3c7eb63aaaff442f99706db93d911aec is 33.665ms for 1816 entries.
Jan 16 17:59:01.158535 systemd-journald[1270]: System Journal (/var/log/journal/3c7eb63aaaff442f99706db93d911aec) is 8M, max 588.1M, 580.1M free.
Jan 16 17:59:01.200063 systemd-journald[1270]: Received client request to flush runtime journal.
Jan 16 17:59:01.200108 kernel: loop1: detected capacity change from 0 to 1648
Jan 16 17:59:01.160000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.183000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.194000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.159232 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jan 16 17:59:01.161255 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jan 16 17:59:01.169601 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Jan 16 17:59:01.182517 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 16 17:59:01.193015 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 16 17:59:01.197759 systemd-tmpfiles[1322]: ACLs are not supported, ignoring.
Jan 16 17:59:01.197770 systemd-tmpfiles[1322]: ACLs are not supported, ignoring.
Jan 16 17:59:01.200978 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 16 17:59:01.202000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.203873 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Jan 16 17:59:01.206000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.207439 kernel: loop2: detected capacity change from 0 to 100192
Jan 16 17:59:01.210569 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jan 16 17:59:01.220595 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Jan 16 17:59:01.221000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.246468 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jan 16 17:59:01.247000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.248000 audit: BPF prog-id=18 op=LOAD
Jan 16 17:59:01.248000 audit: BPF prog-id=19 op=LOAD
Jan 16 17:59:01.248000 audit: BPF prog-id=20 op=LOAD
Jan 16 17:59:01.249307 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer...
Jan 16 17:59:01.251736 kernel: loop3: detected capacity change from 0 to 207008 Jan 16 17:59:01.251000 audit: BPF prog-id=21 op=LOAD Jan 16 17:59:01.255572 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 16 17:59:01.257556 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 16 17:59:01.259000 audit: BPF prog-id=22 op=LOAD Jan 16 17:59:01.259000 audit: BPF prog-id=23 op=LOAD Jan 16 17:59:01.259000 audit: BPF prog-id=24 op=LOAD Jan 16 17:59:01.260330 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 16 17:59:01.268000 audit: BPF prog-id=25 op=LOAD Jan 16 17:59:01.268000 audit: BPF prog-id=26 op=LOAD Jan 16 17:59:01.268000 audit: BPF prog-id=27 op=LOAD Jan 16 17:59:01.269490 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 16 17:59:01.280878 systemd-tmpfiles[1345]: ACLs are not supported, ignoring. Jan 16 17:59:01.280894 systemd-tmpfiles[1345]: ACLs are not supported, ignoring. Jan 16 17:59:01.284569 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 16 17:59:01.285000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:01.288516 kernel: loop4: detected capacity change from 0 to 45344 Jan 16 17:59:01.302136 systemd-nsresourced[1346]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 16 17:59:01.305335 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 16 17:59:01.306000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:01.306582 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 16 17:59:01.308000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:01.327496 kernel: loop5: detected capacity change from 0 to 1648 Jan 16 17:59:01.337452 kernel: loop6: detected capacity change from 0 to 100192 Jan 16 17:59:01.347391 systemd-oomd[1342]: No swap; memory pressure usage will be degraded Jan 16 17:59:01.348000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:01.347849 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 16 17:59:01.352447 kernel: loop7: detected capacity change from 0 to 207008 Jan 16 17:59:01.367346 systemd-resolved[1344]: Positive Trust Anchors: Jan 16 17:59:01.367364 systemd-resolved[1344]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 16 17:59:01.367401 systemd-resolved[1344]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16
Jan 16 17:59:01.367485 systemd-resolved[1344]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 16 17:59:01.371445 kernel: loop1: detected capacity change from 0 to 45344 Jan 16 17:59:01.375355 systemd-resolved[1344]: Using system hostname 'ci-4580-0-0-p-7f6b5ebc40'. Jan 16 17:59:01.377442 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 16 17:59:01.378000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:01.378711 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 16 17:59:01.381104 (sd-merge)[1366]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. Jan 16 17:59:01.383857 (sd-merge)[1366]: Merged extensions into '/usr'. Jan 16 17:59:01.387615 systemd[1]: Reload requested from client PID 1321 ('systemd-sysext') (unit systemd-sysext.service)... Jan 16 17:59:01.387635 systemd[1]: Reloading... Jan 16 17:59:01.439448 zram_generator::config[1396]: No configuration found. Jan 16 17:59:01.589878 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 16 17:59:01.590228 systemd[1]: Reloading finished in 202 ms. Jan 16 17:59:01.610572 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Jan 16 17:59:01.611000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:01.611970 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 16 17:59:01.613000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:01.628602 systemd[1]: Starting ensure-sysext.service... Jan 16 17:59:01.630235 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 16 17:59:01.631000 audit: BPF prog-id=8 op=UNLOAD Jan 16 17:59:01.631000 audit: BPF prog-id=7 op=UNLOAD Jan 16 17:59:01.631000 audit: BPF prog-id=28 op=LOAD Jan 16 17:59:01.631000 audit: BPF prog-id=29 op=LOAD Jan 16 17:59:01.632547 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 16 17:59:01.634000 audit: BPF prog-id=30 op=LOAD Jan 16 17:59:01.634000 audit: BPF prog-id=21 op=UNLOAD Jan 16 17:59:01.634000 audit: BPF prog-id=31 op=LOAD Jan 16 17:59:01.634000 audit: BPF prog-id=22 op=UNLOAD Jan 16 17:59:01.634000 audit: BPF prog-id=32 op=LOAD Jan 16 17:59:01.634000 audit: BPF prog-id=33 op=LOAD Jan 16 17:59:01.634000 audit: BPF prog-id=23 op=UNLOAD Jan 16 17:59:01.634000 audit: BPF prog-id=24 op=UNLOAD Jan 16 17:59:01.635000 audit: BPF prog-id=34 op=LOAD Jan 16 17:59:01.635000 audit: BPF prog-id=15 op=UNLOAD Jan 16 17:59:01.635000 audit: BPF prog-id=35 op=LOAD Jan 16 17:59:01.635000 audit: BPF prog-id=36 op=LOAD Jan 16 17:59:01.635000 audit: BPF prog-id=16 op=UNLOAD Jan 16 17:59:01.635000 audit: BPF prog-id=17 op=UNLOAD Jan 16 17:59:01.636000 audit: BPF prog-id=37 op=LOAD Jan 16 17:59:01.636000 audit: BPF prog-id=18 op=UNLOAD Jan 16 17:59:01.637000 audit: BPF prog-id=38 op=LOAD Jan 16 17:59:01.637000 audit: BPF prog-id=39 op=LOAD Jan 16 17:59:01.637000 audit: BPF prog-id=19 op=UNLOAD Jan 16 17:59:01.637000 audit: BPF prog-id=20 op=UNLOAD Jan 16 17:59:01.637000 audit: BPF prog-id=40 op=LOAD Jan 16 17:59:01.637000 audit: BPF prog-id=25 op=UNLOAD Jan 16 17:59:01.637000 audit: BPF prog-id=41 op=LOAD Jan 16 17:59:01.637000 audit: BPF prog-id=42 op=LOAD Jan 16 17:59:01.637000 audit: BPF prog-id=26 op=UNLOAD Jan 16 17:59:01.637000 audit: BPF prog-id=27 op=UNLOAD Jan 16 17:59:01.645397 systemd-tmpfiles[1434]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 16 17:59:01.645846 systemd-tmpfiles[1434]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 16 17:59:01.646098 systemd-tmpfiles[1434]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 16 17:59:01.647018 systemd-tmpfiles[1434]: ACLs are not supported, ignoring. Jan 16 17:59:01.647070 systemd-tmpfiles[1434]: ACLs are not supported, ignoring. 
Jan 16 17:59:01.652810 systemd[1]: Reload requested from client PID 1433 ('systemctl') (unit ensure-sysext.service)... Jan 16 17:59:01.652824 systemd[1]: Reloading... Jan 16 17:59:01.653410 systemd-tmpfiles[1434]: Detected autofs mount point /boot during canonicalization of boot. Jan 16 17:59:01.653434 systemd-tmpfiles[1434]: Skipping /boot Jan 16 17:59:01.659774 systemd-tmpfiles[1434]: Detected autofs mount point /boot during canonicalization of boot. Jan 16 17:59:01.659789 systemd-tmpfiles[1434]: Skipping /boot Jan 16 17:59:01.668862 systemd-udevd[1435]: Using default interface naming scheme 'v257'. Jan 16 17:59:01.707502 zram_generator::config[1465]: No configuration found. Jan 16 17:59:01.789454 kernel: mousedev: PS/2 mouse device common for all mice Jan 16 17:59:01.847357 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Jan 16 17:59:01.847474 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 16 17:59:01.847506 kernel: [drm] features: -context_init Jan 16 17:59:01.849658 kernel: [drm] number of scanouts: 1 Jan 16 17:59:01.849725 kernel: [drm] number of cap sets: 0 Jan 16 17:59:01.852493 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Jan 16 17:59:01.858866 kernel: Console: switching to colour frame buffer device 160x50 Jan 16 17:59:01.864456 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 16 17:59:01.913007 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 16 17:59:01.914678 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Jan 16 17:59:01.914762 systemd[1]: Reloading finished in 261 ms. Jan 16 17:59:01.923185 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. 
Jan 16 17:59:01.924000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:01.926000 audit: BPF prog-id=43 op=LOAD Jan 16 17:59:01.926000 audit: BPF prog-id=30 op=UNLOAD Jan 16 17:59:01.927000 audit: BPF prog-id=44 op=LOAD Jan 16 17:59:01.927000 audit: BPF prog-id=31 op=UNLOAD Jan 16 17:59:01.927000 audit: BPF prog-id=45 op=LOAD Jan 16 17:59:01.927000 audit: BPF prog-id=46 op=LOAD Jan 16 17:59:01.927000 audit: BPF prog-id=32 op=UNLOAD Jan 16 17:59:01.927000 audit: BPF prog-id=33 op=UNLOAD Jan 16 17:59:01.928000 audit: BPF prog-id=47 op=LOAD Jan 16 17:59:01.928000 audit: BPF prog-id=37 op=UNLOAD Jan 16 17:59:01.928000 audit: BPF prog-id=48 op=LOAD Jan 16 17:59:01.928000 audit: BPF prog-id=49 op=LOAD Jan 16 17:59:01.928000 audit: BPF prog-id=38 op=UNLOAD Jan 16 17:59:01.928000 audit: BPF prog-id=39 op=UNLOAD Jan 16 17:59:01.928000 audit: BPF prog-id=50 op=LOAD Jan 16 17:59:01.928000 audit: BPF prog-id=34 op=UNLOAD Jan 16 17:59:01.928000 audit: BPF prog-id=51 op=LOAD Jan 16 17:59:01.928000 audit: BPF prog-id=52 op=LOAD Jan 16 17:59:01.929000 audit: BPF prog-id=35 op=UNLOAD Jan 16 17:59:01.929000 audit: BPF prog-id=36 op=UNLOAD Jan 16 17:59:01.929000 audit: BPF prog-id=53 op=LOAD Jan 16 17:59:01.929000 audit: BPF prog-id=54 op=LOAD Jan 16 17:59:01.929000 audit: BPF prog-id=28 op=UNLOAD Jan 16 17:59:01.929000 audit: BPF prog-id=29 op=UNLOAD Jan 16 17:59:01.929000 audit: BPF prog-id=55 op=LOAD Jan 16 17:59:01.929000 audit: BPF prog-id=40 op=UNLOAD Jan 16 17:59:01.929000 audit: BPF prog-id=56 op=LOAD Jan 16 17:59:01.929000 audit: BPF prog-id=57 op=LOAD Jan 16 17:59:01.929000 audit: BPF prog-id=41 op=UNLOAD Jan 16 17:59:01.929000 audit: BPF prog-id=42 op=UNLOAD Jan 16 17:59:01.938620 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Jan 16 17:59:01.939000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:01.962523 systemd[1]: Finished ensure-sysext.service. Jan 16 17:59:01.962000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:01.978466 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 16 17:59:01.980254 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 16 17:59:01.981561 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 16 17:59:01.982467 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 16 17:59:01.990543 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 16 17:59:01.992534 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 16 17:59:01.994296 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 16 17:59:01.996299 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 16 17:59:02.003292 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 16 17:59:02.005470 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 16 17:59:02.006720 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 16 17:59:02.006818 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. 
Jan 16 17:59:02.010065 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 16 17:59:02.012028 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 16 17:59:02.014504 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 16 17:59:02.015531 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 16 17:59:02.017000 audit: BPF prog-id=58 op=LOAD Jan 16 17:59:02.018688 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 16 17:59:02.019782 systemd[1]: Reached target time-set.target - System Time Set. Jan 16 17:59:02.022275 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 16 17:59:02.024900 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 16 17:59:02.024968 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 16 17:59:02.025989 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 16 17:59:02.028739 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 16 17:59:02.030461 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 16 17:59:02.031435 kernel: PTP clock support registered Jan 16 17:59:02.031000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:02.031000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:02.032060 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 16 17:59:02.032232 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 16 17:59:02.033000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:02.033000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:02.033762 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 16 17:59:02.033941 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 16 17:59:02.034000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:02.034000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:02.035326 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 16 17:59:02.035518 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 16 17:59:02.037000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:02.037000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:02.037854 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 16 17:59:02.038038 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 16 17:59:02.037000 audit[1575]: SYSTEM_BOOT pid=1575 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 16 17:59:02.039000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:02.039000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:02.039882 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 16 17:59:02.040058 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 16 17:59:02.041000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:02.041000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:02.041757 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 16 17:59:02.041936 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 16 17:59:02.043000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:02.043000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:02.044652 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 16 17:59:02.045000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:02.061977 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 16 17:59:02.065646 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 16 17:59:02.067332 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 16 17:59:02.067396 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 16 17:59:02.072626 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 16 17:59:02.074000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:02.078161 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 16 17:59:02.079000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 17:59:02.080002 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 16 17:59:02.081631 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 16 17:59:02.092000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 16 17:59:02.092000 audit[1610]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=fffffbc1df20 a2=420 a3=0 items=0 ppid=1556 pid=1610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.092000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 16 17:59:02.093076 augenrules[1610]: No rules Jan 16 17:59:02.096366 systemd[1]: audit-rules.service: Deactivated successfully. Jan 16 17:59:02.096742 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 16 17:59:02.116134 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 16 17:59:02.118172 systemd-networkd[1574]: lo: Link UP Jan 16 17:59:02.118414 systemd-networkd[1574]: lo: Gained carrier Jan 16 17:59:02.119745 systemd-networkd[1574]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 16 17:59:02.119827 systemd-networkd[1574]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 16 17:59:02.119900 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 16 17:59:02.120939 systemd-networkd[1574]: eth0: Link UP Jan 16 17:59:02.121083 systemd-networkd[1574]: eth0: Gained carrier Jan 16 17:59:02.121095 systemd-networkd[1574]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 16 17:59:02.121923 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 16 17:59:02.125043 systemd[1]: Reached target network.target - Network. Jan 16 17:59:02.127394 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 16 17:59:02.129665 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 16 17:59:02.130827 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 16 17:59:02.132474 systemd-networkd[1574]: eth0: DHCPv4 address 10.0.7.62/25, gateway 10.0.7.1 acquired from 10.0.7.1 Jan 16 17:59:02.153102 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 16 17:59:02.468502 ldconfig[1566]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 16 17:59:02.472399 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 16 17:59:02.474860 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 16 17:59:02.495302 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 16 17:59:02.496644 systemd[1]: Reached target sysinit.target - System Initialization. Jan 16 17:59:02.497754 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
Jan 16 17:59:02.498922 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 16 17:59:02.500233 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 16 17:59:02.501390 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 16 17:59:02.502659 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 16 17:59:02.503852 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 16 17:59:02.504868 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 16 17:59:02.505996 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 16 17:59:02.506035 systemd[1]: Reached target paths.target - Path Units. Jan 16 17:59:02.506883 systemd[1]: Reached target timers.target - Timer Units. Jan 16 17:59:02.508202 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 16 17:59:02.510517 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 16 17:59:02.513196 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 16 17:59:02.514570 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 16 17:59:02.515673 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 16 17:59:02.520397 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 16 17:59:02.521607 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 16 17:59:02.523150 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 16 17:59:02.524230 systemd[1]: Reached target sockets.target - Socket Units. Jan 16 17:59:02.525151 systemd[1]: Reached target basic.target - Basic System. 
Jan 16 17:59:02.526064 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 16 17:59:02.526098 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 16 17:59:02.528596 systemd[1]: Starting chronyd.service - NTP client/server... Jan 16 17:59:02.530202 systemd[1]: Starting containerd.service - containerd container runtime... Jan 16 17:59:02.532237 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 16 17:59:02.534138 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 16 17:59:02.537572 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 16 17:59:02.539557 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 16 17:59:02.540440 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 17:59:02.542469 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 16 17:59:02.543364 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 16 17:59:02.546177 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 16 17:59:02.549599 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 16 17:59:02.550853 jq[1634]: false Jan 16 17:59:02.552506 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 16 17:59:02.554838 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 16 17:59:02.561181 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 16 17:59:02.562160 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). 
Jan 16 17:59:02.562614 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 16 17:59:02.563684 systemd[1]: Starting update-engine.service - Update Engine... Jan 16 17:59:02.566067 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 16 17:59:02.566290 extend-filesystems[1636]: Found /dev/vda6 Jan 16 17:59:02.571562 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 16 17:59:02.573184 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 16 17:59:02.573407 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 16 17:59:02.575372 extend-filesystems[1636]: Found /dev/vda9 Jan 16 17:59:02.575689 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 16 17:59:02.575870 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 16 17:59:02.578917 chronyd[1628]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 16 17:59:02.580949 chronyd[1628]: Loaded seccomp filter (level 2) Jan 16 17:59:02.581933 extend-filesystems[1636]: Checking size of /dev/vda9 Jan 16 17:59:02.583124 systemd[1]: Started chronyd.service - NTP client/server. Jan 16 17:59:02.590179 jq[1647]: true Jan 16 17:59:02.601396 tar[1653]: linux-arm64/LICENSE Jan 16 17:59:02.601728 tar[1653]: linux-arm64/helm Jan 16 17:59:02.608865 systemd[1]: motdgen.service: Deactivated successfully. Jan 16 17:59:02.611673 update_engine[1645]: I20260116 17:59:02.607745 1645 main.cc:92] Flatcar Update Engine starting Jan 16 17:59:02.609124 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Jan 16 17:59:02.615788 jq[1675]: true Jan 16 17:59:02.617429 extend-filesystems[1636]: Resized partition /dev/vda9 Jan 16 17:59:02.623016 extend-filesystems[1684]: resize2fs 1.47.3 (8-Jul-2025) Jan 16 17:59:02.630544 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 16 17:59:02.638899 dbus-daemon[1631]: [system] SELinux support is enabled Jan 16 17:59:02.639123 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 16 17:59:02.645080 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 16 17:59:02.645109 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 16 17:59:02.646488 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 16 17:59:02.646512 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 16 17:59:02.660580 systemd[1]: Started update-engine.service - Update Engine. Jan 16 17:59:02.661677 update_engine[1645]: I20260116 17:59:02.661329 1645 update_check_scheduler.cc:74] Next update check in 4m32s Jan 16 17:59:02.666164 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 16 17:59:02.682338 systemd-logind[1644]: New seat seat0. Jan 16 17:59:02.723998 locksmithd[1701]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 16 17:59:02.750132 systemd-logind[1644]: Watching system buttons on /dev/input/event0 (Power Button) Jan 16 17:59:02.750149 systemd-logind[1644]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jan 16 17:59:02.750410 systemd[1]: Started systemd-logind.service - User Login Management. 
Jan 16 17:59:02.771737 bash[1700]: Updated "/home/core/.ssh/authorized_keys" Jan 16 17:59:02.776497 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 16 17:59:02.781344 systemd[1]: Starting sshkeys.service... Jan 16 17:59:02.785978 containerd[1658]: time="2026-01-16T17:59:02Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 16 17:59:02.787033 containerd[1658]: time="2026-01-16T17:59:02.786976120Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 16 17:59:02.801164 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 16 17:59:02.805714 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 16 17:59:02.809806 containerd[1658]: time="2026-01-16T17:59:02.809677680Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.8µs" Jan 16 17:59:02.810091 containerd[1658]: time="2026-01-16T17:59:02.810068040Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 16 17:59:02.810432 containerd[1658]: time="2026-01-16T17:59:02.810397200Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 16 17:59:02.810708 containerd[1658]: time="2026-01-16T17:59:02.810559240Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 16 17:59:02.810910 containerd[1658]: time="2026-01-16T17:59:02.810888720Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 16 17:59:02.810975 containerd[1658]: time="2026-01-16T17:59:02.810961880Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 16 17:59:02.811189 containerd[1658]: time="2026-01-16T17:59:02.811159240Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 16 17:59:02.811862 containerd[1658]: time="2026-01-16T17:59:02.811281040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 16 17:59:02.811862 containerd[1658]: time="2026-01-16T17:59:02.811645240Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 16 17:59:02.811862 containerd[1658]: time="2026-01-16T17:59:02.811661520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 16 17:59:02.811862 containerd[1658]: time="2026-01-16T17:59:02.811673120Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 16 17:59:02.811862 containerd[1658]: time="2026-01-16T17:59:02.811681200Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 16 17:59:02.812139 containerd[1658]: time="2026-01-16T17:59:02.812117280Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 16 17:59:02.812255 containerd[1658]: time="2026-01-16T17:59:02.812239000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 16 17:59:02.812509 containerd[1658]: time="2026-01-16T17:59:02.812487640Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 16 17:59:02.813071 containerd[1658]: time="2026-01-16T17:59:02.813006360Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 16 17:59:02.813214 containerd[1658]: time="2026-01-16T17:59:02.813197280Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 16 17:59:02.813373 containerd[1658]: time="2026-01-16T17:59:02.813354600Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 16 17:59:02.813531 containerd[1658]: time="2026-01-16T17:59:02.813512600Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 16 17:59:02.813941 containerd[1658]: time="2026-01-16T17:59:02.813885160Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 16 17:59:02.814227 containerd[1658]: time="2026-01-16T17:59:02.814204600Z" level=info msg="metadata content store policy set" policy=shared Jan 16 17:59:02.828527 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 17:59:02.840921 containerd[1658]: time="2026-01-16T17:59:02.840887320Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 16 17:59:02.841321 containerd[1658]: time="2026-01-16T17:59:02.841287120Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 16 17:59:02.841696 containerd[1658]: time="2026-01-16T17:59:02.841661000Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 16 17:59:02.842146 containerd[1658]: 
time="2026-01-16T17:59:02.842000400Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 16 17:59:02.842146 containerd[1658]: time="2026-01-16T17:59:02.842028320Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 16 17:59:02.842146 containerd[1658]: time="2026-01-16T17:59:02.842041520Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 16 17:59:02.842146 containerd[1658]: time="2026-01-16T17:59:02.842052800Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 16 17:59:02.842146 containerd[1658]: time="2026-01-16T17:59:02.842074720Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 16 17:59:02.842146 containerd[1658]: time="2026-01-16T17:59:02.842092520Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 16 17:59:02.842146 containerd[1658]: time="2026-01-16T17:59:02.842107200Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 16 17:59:02.842146 containerd[1658]: time="2026-01-16T17:59:02.842128600Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 16 17:59:02.842637 containerd[1658]: time="2026-01-16T17:59:02.842353320Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 16 17:59:02.842637 containerd[1658]: time="2026-01-16T17:59:02.842376200Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 16 17:59:02.842637 containerd[1658]: time="2026-01-16T17:59:02.842389440Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 16 17:59:02.842904 containerd[1658]: 
time="2026-01-16T17:59:02.842883440Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 16 17:59:02.843104 containerd[1658]: time="2026-01-16T17:59:02.843078720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 16 17:59:02.843266 containerd[1658]: time="2026-01-16T17:59:02.843161880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 16 17:59:02.843266 containerd[1658]: time="2026-01-16T17:59:02.843176920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 16 17:59:02.843266 containerd[1658]: time="2026-01-16T17:59:02.843187200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 16 17:59:02.843266 containerd[1658]: time="2026-01-16T17:59:02.843196440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 16 17:59:02.843460 containerd[1658]: time="2026-01-16T17:59:02.843362280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 16 17:59:02.843460 containerd[1658]: time="2026-01-16T17:59:02.843386000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 16 17:59:02.843460 containerd[1658]: time="2026-01-16T17:59:02.843398440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 16 17:59:02.843547 containerd[1658]: time="2026-01-16T17:59:02.843531640Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 16 17:59:02.843596 containerd[1658]: time="2026-01-16T17:59:02.843584960Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 16 17:59:02.843673 containerd[1658]: time="2026-01-16T17:59:02.843661200Z" level=info msg="loading plugin" 
id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 16 17:59:02.843872 containerd[1658]: time="2026-01-16T17:59:02.843813080Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 16 17:59:02.843872 containerd[1658]: time="2026-01-16T17:59:02.843835840Z" level=info msg="Start snapshots syncer" Jan 16 17:59:02.844037 containerd[1658]: time="2026-01-16T17:59:02.843957080Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 16 17:59:02.844436 containerd[1658]: time="2026-01-16T17:59:02.844352360Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 16 17:59:02.844682 containerd[1658]: time="2026-01-16T17:59:02.844414480Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 16 17:59:02.844682 containerd[1658]: time="2026-01-16T17:59:02.844604960Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 16 17:59:02.845070 containerd[1658]: time="2026-01-16T17:59:02.844969560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 16 17:59:02.845070 containerd[1658]: time="2026-01-16T17:59:02.845013200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 16 17:59:02.845070 containerd[1658]: time="2026-01-16T17:59:02.845024280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 16 17:59:02.845070 containerd[1658]: time="2026-01-16T17:59:02.845044000Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 16 17:59:02.845292 containerd[1658]: time="2026-01-16T17:59:02.845055200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 16 17:59:02.845292 containerd[1658]: time="2026-01-16T17:59:02.845231080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 16 17:59:02.845292 containerd[1658]: time="2026-01-16T17:59:02.845246160Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 16 17:59:02.845292 containerd[1658]: time="2026-01-16T17:59:02.845271800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 16 17:59:02.845478 containerd[1658]: time="2026-01-16T17:59:02.845433000Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 16 17:59:02.845542 containerd[1658]: time="2026-01-16T17:59:02.845527960Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 16 17:59:02.845836 containerd[1658]: time="2026-01-16T17:59:02.845768600Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 16 17:59:02.845836 containerd[1658]: time="2026-01-16T17:59:02.845786920Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 16 17:59:02.845836 containerd[1658]: time="2026-01-16T17:59:02.845796480Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 16 17:59:02.845836 containerd[1658]: time="2026-01-16T17:59:02.845804280Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 16 17:59:02.845836 containerd[1658]: time="2026-01-16T17:59:02.845816840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 16 17:59:02.846099 containerd[1658]: time="2026-01-16T17:59:02.845979040Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 16 17:59:02.846210 containerd[1658]: time="2026-01-16T17:59:02.846139560Z" level=info msg="runtime interface created" Jan 16 17:59:02.846210 containerd[1658]: 
time="2026-01-16T17:59:02.846152040Z" level=info msg="created NRI interface" Jan 16 17:59:02.846210 containerd[1658]: time="2026-01-16T17:59:02.846162760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 16 17:59:02.846210 containerd[1658]: time="2026-01-16T17:59:02.846174640Z" level=info msg="Connect containerd service" Jan 16 17:59:02.846436 containerd[1658]: time="2026-01-16T17:59:02.846372280Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 16 17:59:02.847726 containerd[1658]: time="2026-01-16T17:59:02.847701960Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 16 17:59:02.942014 containerd[1658]: time="2026-01-16T17:59:02.941952760Z" level=info msg="Start subscribing containerd event" Jan 16 17:59:02.942213 containerd[1658]: time="2026-01-16T17:59:02.942196600Z" level=info msg="Start recovering state" Jan 16 17:59:02.942348 containerd[1658]: time="2026-01-16T17:59:02.942333360Z" level=info msg="Start event monitor" Jan 16 17:59:02.942409 containerd[1658]: time="2026-01-16T17:59:02.942396880Z" level=info msg="Start cni network conf syncer for default" Jan 16 17:59:02.942506 containerd[1658]: time="2026-01-16T17:59:02.942492480Z" level=info msg="Start streaming server" Jan 16 17:59:02.942557 containerd[1658]: time="2026-01-16T17:59:02.942547200Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 16 17:59:02.942977 containerd[1658]: time="2026-01-16T17:59:02.942588640Z" level=info msg="runtime interface starting up..." Jan 16 17:59:02.942977 containerd[1658]: time="2026-01-16T17:59:02.942599840Z" level=info msg="starting plugins..." 
Jan 16 17:59:02.942977 containerd[1658]: time="2026-01-16T17:59:02.942617840Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 16 17:59:02.943161 containerd[1658]: time="2026-01-16T17:59:02.943141160Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 16 17:59:02.943244 containerd[1658]: time="2026-01-16T17:59:02.943231840Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 16 17:59:02.943356 containerd[1658]: time="2026-01-16T17:59:02.943342880Z" level=info msg="containerd successfully booted in 0.157721s" Jan 16 17:59:02.943549 systemd[1]: Started containerd.service - containerd container runtime. Jan 16 17:59:02.978453 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 16 17:59:03.001662 extend-filesystems[1684]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 16 17:59:03.001662 extend-filesystems[1684]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 16 17:59:03.001662 extend-filesystems[1684]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Jan 16 17:59:03.005416 extend-filesystems[1636]: Resized filesystem in /dev/vda9 Jan 16 17:59:03.005176 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 16 17:59:03.005440 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 16 17:59:03.070968 tar[1653]: linux-arm64/README.md Jan 16 17:59:03.086813 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 16 17:59:03.562427 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 17:59:03.628017 sshd_keygen[1655]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 16 17:59:03.648505 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 16 17:59:03.651188 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 16 17:59:03.680894 systemd[1]: issuegen.service: Deactivated successfully. 
Jan 16 17:59:03.681136 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 16 17:59:03.683826 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 16 17:59:03.710212 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 16 17:59:03.713062 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 16 17:59:03.715226 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 16 17:59:03.716643 systemd[1]: Reached target getty.target - Login Prompts. Jan 16 17:59:03.842495 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 17:59:03.858634 systemd-networkd[1574]: eth0: Gained IPv6LL Jan 16 17:59:03.866659 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 16 17:59:03.868444 systemd[1]: Reached target network-online.target - Network is Online. Jan 16 17:59:03.870811 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:59:03.872830 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 16 17:59:03.903125 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 16 17:59:04.737674 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 16 17:59:04.754170 (kubelet)[1773]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 17:59:05.258602 kubelet[1773]: E0116 17:59:05.258559 1773 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 17:59:05.261043 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 17:59:05.261174 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 16 17:59:05.261538 systemd[1]: kubelet.service: Consumed 767ms CPU time, 255.8M memory peak. Jan 16 17:59:05.570483 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 17:59:05.854484 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 17:59:09.578461 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 17:59:09.585193 coreos-metadata[1630]: Jan 16 17:59:09.585 WARN failed to locate config-drive, using the metadata service API instead Jan 16 17:59:09.603655 coreos-metadata[1630]: Jan 16 17:59:09.603 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 16 17:59:09.867489 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 16 17:59:09.872214 coreos-metadata[1717]: Jan 16 17:59:09.872 WARN failed to locate config-drive, using the metadata service API instead Jan 16 17:59:09.885135 coreos-metadata[1717]: Jan 16 17:59:09.885 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 16 17:59:10.320827 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 16 17:59:10.322046 systemd[1]: Started sshd@0-10.0.7.62:22-4.153.228.146:52250.service - OpenSSH per-connection server daemon (4.153.228.146:52250). 
Jan 16 17:59:10.870399 sshd[1792]: Accepted publickey for core from 4.153.228.146 port 52250 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 17:59:10.873632 sshd-session[1792]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:59:10.880085 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 16 17:59:10.881129 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 16 17:59:10.885895 systemd-logind[1644]: New session 1 of user core. Jan 16 17:59:10.912599 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 16 17:59:10.914984 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 16 17:59:10.930606 (systemd)[1798]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:59:10.934368 systemd-logind[1644]: New session 2 of user core. Jan 16 17:59:11.017787 coreos-metadata[1630]: Jan 16 17:59:11.017 INFO Fetch successful Jan 16 17:59:11.017787 coreos-metadata[1630]: Jan 16 17:59:11.017 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 16 17:59:11.038890 systemd[1798]: Queued start job for default target default.target. Jan 16 17:59:11.046503 systemd[1798]: Created slice app.slice - User Application Slice. Jan 16 17:59:11.046536 systemd[1798]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 16 17:59:11.046548 systemd[1798]: Reached target paths.target - Paths. Jan 16 17:59:11.046597 systemd[1798]: Reached target timers.target - Timers. Jan 16 17:59:11.047749 systemd[1798]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 16 17:59:11.048501 systemd[1798]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 16 17:59:11.057749 systemd[1798]: Listening on dbus.socket - D-Bus User Message Bus Socket. 
Jan 16 17:59:11.057976 systemd[1798]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 16 17:59:11.058036 systemd[1798]: Reached target sockets.target - Sockets. Jan 16 17:59:11.058072 systemd[1798]: Reached target basic.target - Basic System. Jan 16 17:59:11.058099 systemd[1798]: Reached target default.target - Main User Target. Jan 16 17:59:11.058123 systemd[1798]: Startup finished in 118ms. Jan 16 17:59:11.058610 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 16 17:59:11.066959 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 16 17:59:11.369816 systemd[1]: Started sshd@1-10.0.7.62:22-4.153.228.146:52254.service - OpenSSH per-connection server daemon (4.153.228.146:52254). Jan 16 17:59:11.581336 coreos-metadata[1717]: Jan 16 17:59:11.581 INFO Fetch successful Jan 16 17:59:11.581336 coreos-metadata[1717]: Jan 16 17:59:11.581 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 16 17:59:11.888069 sshd[1812]: Accepted publickey for core from 4.153.228.146 port 52254 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 17:59:11.889470 sshd-session[1812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:59:11.895474 systemd-logind[1644]: New session 3 of user core. Jan 16 17:59:11.899786 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 16 17:59:12.181351 sshd[1816]: Connection closed by 4.153.228.146 port 52254 Jan 16 17:59:12.181660 sshd-session[1812]: pam_unix(sshd:session): session closed for user core Jan 16 17:59:12.185254 systemd[1]: sshd@1-10.0.7.62:22-4.153.228.146:52254.service: Deactivated successfully. Jan 16 17:59:12.187003 systemd[1]: session-3.scope: Deactivated successfully. Jan 16 17:59:12.188482 systemd-logind[1644]: Session 3 logged out. Waiting for processes to exit. Jan 16 17:59:12.190478 systemd-logind[1644]: Removed session 3. 
Jan 16 17:59:12.214468 coreos-metadata[1630]: Jan 16 17:59:12.214 INFO Fetch successful Jan 16 17:59:12.214742 coreos-metadata[1630]: Jan 16 17:59:12.214 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 16 17:59:12.292460 systemd[1]: Started sshd@2-10.0.7.62:22-4.153.228.146:52268.service - OpenSSH per-connection server daemon (4.153.228.146:52268). Jan 16 17:59:12.819361 coreos-metadata[1717]: Jan 16 17:59:12.819 INFO Fetch successful Jan 16 17:59:12.821328 unknown[1717]: wrote ssh authorized keys file for user: core Jan 16 17:59:12.826950 coreos-metadata[1630]: Jan 16 17:59:12.826 INFO Fetch successful Jan 16 17:59:12.826950 coreos-metadata[1630]: Jan 16 17:59:12.826 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 16 17:59:12.836401 sshd[1822]: Accepted publickey for core from 4.153.228.146 port 52268 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 17:59:12.837802 sshd-session[1822]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:59:12.843306 systemd-logind[1644]: New session 4 of user core. Jan 16 17:59:12.843832 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 16 17:59:12.848899 update-ssh-keys[1826]: Updated "/home/core/.ssh/authorized_keys" Jan 16 17:59:12.850065 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 16 17:59:12.852124 systemd[1]: Finished sshkeys.service. Jan 16 17:59:13.129573 sshd[1828]: Connection closed by 4.153.228.146 port 52268 Jan 16 17:59:13.129869 sshd-session[1822]: pam_unix(sshd:session): session closed for user core Jan 16 17:59:13.133735 systemd[1]: sshd@2-10.0.7.62:22-4.153.228.146:52268.service: Deactivated successfully. Jan 16 17:59:13.136849 systemd[1]: session-4.scope: Deactivated successfully. Jan 16 17:59:13.137972 systemd-logind[1644]: Session 4 logged out. Waiting for processes to exit. 
Jan 16 17:59:13.138931 systemd-logind[1644]: Removed session 4. Jan 16 17:59:13.440729 coreos-metadata[1630]: Jan 16 17:59:13.440 INFO Fetch successful Jan 16 17:59:13.440729 coreos-metadata[1630]: Jan 16 17:59:13.440 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 16 17:59:15.398087 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 16 17:59:15.399635 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:59:15.455577 coreos-metadata[1630]: Jan 16 17:59:15.455 INFO Fetch successful Jan 16 17:59:15.455577 coreos-metadata[1630]: Jan 16 17:59:15.455 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 16 17:59:15.542369 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:59:15.546382 (kubelet)[1842]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 17:59:15.580084 kubelet[1842]: E0116 17:59:15.580023 1842 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 17:59:15.583190 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 17:59:15.583319 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 16 17:59:15.584551 systemd[1]: kubelet.service: Consumed 140ms CPU time, 107.7M memory peak. Jan 16 17:59:17.193400 coreos-metadata[1630]: Jan 16 17:59:17.193 INFO Fetch successful Jan 16 17:59:17.218465 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 16 17:59:17.219136 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
Jan 16 17:59:17.219270 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 16 17:59:17.219402 systemd[1]: Startup finished in 2.640s (kernel) + 16.231s (initrd) + 16.921s (userspace) = 35.793s. Jan 16 17:59:23.241317 systemd[1]: Started sshd@3-10.0.7.62:22-4.153.228.146:40718.service - OpenSSH per-connection server daemon (4.153.228.146:40718). Jan 16 17:59:23.774285 sshd[1857]: Accepted publickey for core from 4.153.228.146 port 40718 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 17:59:23.775679 sshd-session[1857]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:59:23.779980 systemd-logind[1644]: New session 5 of user core. Jan 16 17:59:23.786567 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 16 17:59:24.067481 sshd[1861]: Connection closed by 4.153.228.146 port 40718 Jan 16 17:59:24.067834 sshd-session[1857]: pam_unix(sshd:session): session closed for user core Jan 16 17:59:24.071730 systemd[1]: sshd@3-10.0.7.62:22-4.153.228.146:40718.service: Deactivated successfully. Jan 16 17:59:24.073346 systemd[1]: session-5.scope: Deactivated successfully. Jan 16 17:59:24.075849 systemd-logind[1644]: Session 5 logged out. Waiting for processes to exit. Jan 16 17:59:24.076987 systemd-logind[1644]: Removed session 5. Jan 16 17:59:24.174852 systemd[1]: Started sshd@4-10.0.7.62:22-4.153.228.146:40720.service - OpenSSH per-connection server daemon (4.153.228.146:40720). Jan 16 17:59:24.693956 sshd[1868]: Accepted publickey for core from 4.153.228.146 port 40720 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 17:59:24.695259 sshd-session[1868]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:59:24.699201 systemd-logind[1644]: New session 6 of user core. Jan 16 17:59:24.715792 systemd[1]: Started session-6.scope - Session 6 of User core. 
Jan 16 17:59:24.983545 sshd[1872]: Connection closed by 4.153.228.146 port 40720 Jan 16 17:59:24.983834 sshd-session[1868]: pam_unix(sshd:session): session closed for user core Jan 16 17:59:24.987782 systemd[1]: sshd@4-10.0.7.62:22-4.153.228.146:40720.service: Deactivated successfully. Jan 16 17:59:24.989284 systemd[1]: session-6.scope: Deactivated successfully. Jan 16 17:59:24.989958 systemd-logind[1644]: Session 6 logged out. Waiting for processes to exit. Jan 16 17:59:24.990752 systemd-logind[1644]: Removed session 6. Jan 16 17:59:25.093621 systemd[1]: Started sshd@5-10.0.7.62:22-4.153.228.146:50136.service - OpenSSH per-connection server daemon (4.153.228.146:50136). Jan 16 17:59:25.612463 sshd[1878]: Accepted publickey for core from 4.153.228.146 port 50136 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 17:59:25.613624 sshd-session[1878]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:59:25.614515 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 16 17:59:25.616165 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:59:25.618939 systemd-logind[1644]: New session 7 of user core. Jan 16 17:59:25.627809 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 16 17:59:25.737309 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 16 17:59:25.740932 (kubelet)[1891]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 17:59:25.774109 kubelet[1891]: E0116 17:59:25.774050 1891 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 17:59:25.776305 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 17:59:25.776451 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 16 17:59:25.778709 systemd[1]: kubelet.service: Consumed 136ms CPU time, 107M memory peak. Jan 16 17:59:25.904924 sshd[1885]: Connection closed by 4.153.228.146 port 50136 Jan 16 17:59:25.905668 sshd-session[1878]: pam_unix(sshd:session): session closed for user core Jan 16 17:59:25.909483 systemd[1]: sshd@5-10.0.7.62:22-4.153.228.146:50136.service: Deactivated successfully. Jan 16 17:59:25.911738 systemd[1]: session-7.scope: Deactivated successfully. Jan 16 17:59:25.914774 systemd-logind[1644]: Session 7 logged out. Waiting for processes to exit. Jan 16 17:59:25.915955 systemd-logind[1644]: Removed session 7. Jan 16 17:59:26.016655 systemd[1]: Started sshd@6-10.0.7.62:22-4.153.228.146:50144.service - OpenSSH per-connection server daemon (4.153.228.146:50144). Jan 16 17:59:26.366280 chronyd[1628]: Selected source PHC0 Jan 16 17:59:26.542860 sshd[1904]: Accepted publickey for core from 4.153.228.146 port 50144 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 17:59:26.544074 sshd-session[1904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:59:26.547645 systemd-logind[1644]: New session 8 of user core. 
Jan 16 17:59:26.559849 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 16 17:59:26.738052 sudo[1909]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 16 17:59:26.738291 sudo[1909]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 16 17:59:26.750278 sudo[1909]: pam_unix(sudo:session): session closed for user root Jan 16 17:59:26.839490 sshd[1908]: Connection closed by 4.153.228.146 port 50144 Jan 16 17:59:26.839248 sshd-session[1904]: pam_unix(sshd:session): session closed for user core Jan 16 17:59:26.843443 systemd[1]: sshd@6-10.0.7.62:22-4.153.228.146:50144.service: Deactivated successfully. Jan 16 17:59:26.844882 systemd[1]: session-8.scope: Deactivated successfully. Jan 16 17:59:26.845488 systemd-logind[1644]: Session 8 logged out. Waiting for processes to exit. Jan 16 17:59:26.846240 systemd-logind[1644]: Removed session 8. Jan 16 17:59:26.945445 systemd[1]: Started sshd@7-10.0.7.62:22-4.153.228.146:50150.service - OpenSSH per-connection server daemon (4.153.228.146:50150). Jan 16 17:59:27.451899 sshd[1916]: Accepted publickey for core from 4.153.228.146 port 50150 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 17:59:27.453177 sshd-session[1916]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:59:27.456399 systemd-logind[1644]: New session 9 of user core. Jan 16 17:59:27.465549 systemd[1]: Started session-9.scope - Session 9 of User core. 
Jan 16 17:59:27.640183 sudo[1922]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 16 17:59:27.640439 sudo[1922]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 16 17:59:27.642768 sudo[1922]: pam_unix(sudo:session): session closed for user root Jan 16 17:59:27.647698 sudo[1921]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 16 17:59:27.647920 sudo[1921]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 16 17:59:27.653871 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 16 17:59:27.688585 kernel: kauditd_printk_skb: 192 callbacks suppressed Jan 16 17:59:27.688659 kernel: audit: type=1305 audit(1768586367.685:236): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 16 17:59:27.685000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 16 17:59:27.685000 audit[1946]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffe392ad60 a2=420 a3=0 items=0 ppid=1927 pid=1946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:27.689034 augenrules[1946]: No rules Jan 16 17:59:27.692308 kernel: audit: type=1300 audit(1768586367.685:236): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffe392ad60 a2=420 a3=0 items=0 ppid=1927 pid=1946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:27.690136 systemd[1]: audit-rules.service: Deactivated successfully. 
Jan 16 17:59:27.691453 sudo[1921]: pam_unix(sudo:session): session closed for user root Jan 16 17:59:27.690390 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 16 17:59:27.685000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 16 17:59:27.694344 kernel: audit: type=1327 audit(1768586367.685:236): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 16 17:59:27.689000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:27.689000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:27.698693 kernel: audit: type=1130 audit(1768586367.689:237): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:27.698730 kernel: audit: type=1131 audit(1768586367.689:238): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:27.690000 audit[1921]: USER_END pid=1921 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:59:27.701271 kernel: audit: type=1106 audit(1768586367.690:239): pid=1921 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:59:27.690000 audit[1921]: CRED_DISP pid=1921 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:59:27.703665 kernel: audit: type=1104 audit(1768586367.690:240): pid=1921 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:59:27.783536 sshd[1920]: Connection closed by 4.153.228.146 port 50150 Jan 16 17:59:27.783575 sshd-session[1916]: pam_unix(sshd:session): session closed for user core Jan 16 17:59:27.784000 audit[1916]: USER_END pid=1916 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 17:59:27.784000 audit[1916]: CRED_DISP pid=1916 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 17:59:27.789610 systemd[1]: sshd@7-10.0.7.62:22-4.153.228.146:50150.service: Deactivated successfully.
Jan 16 17:59:27.791300 kernel: audit: type=1106 audit(1768586367.784:241): pid=1916 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 17:59:27.791354 kernel: audit: type=1104 audit(1768586367.784:242): pid=1916 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 17:59:27.791372 kernel: audit: type=1131 audit(1768586367.788:243): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.7.62:22-4.153.228.146:50150 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:27.788000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.7.62:22-4.153.228.146:50150 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:27.791537 systemd[1]: session-9.scope: Deactivated successfully. Jan 16 17:59:27.793912 systemd-logind[1644]: Session 9 logged out. Waiting for processes to exit. Jan 16 17:59:27.794614 systemd-logind[1644]: Removed session 9. Jan 16 17:59:27.884000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.7.62:22-4.153.228.146:50154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:27.885225 systemd[1]: Started sshd@8-10.0.7.62:22-4.153.228.146:50154.service - OpenSSH per-connection server daemon (4.153.228.146:50154). 
Jan 16 17:59:28.361000 audit[1955]: USER_ACCT pid=1955 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 17:59:28.362855 sshd[1955]: Accepted publickey for core from 4.153.228.146 port 50154 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 17:59:28.362000 audit[1955]: CRED_ACQ pid=1955 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 17:59:28.362000 audit[1955]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdfd0b9e0 a2=3 a3=0 items=0 ppid=1 pid=1955 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:28.362000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 17:59:28.364048 sshd-session[1955]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:59:28.367858 systemd-logind[1644]: New session 10 of user core. Jan 16 17:59:28.380736 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 16 17:59:28.381000 audit[1955]: USER_START pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 17:59:28.382000 audit[1959]: CRED_ACQ pid=1959 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 17:59:28.544000 audit[1960]: USER_ACCT pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:59:28.545220 sudo[1960]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 16 17:59:28.544000 audit[1960]: CRED_REFR pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:59:28.544000 audit[1960]: USER_START pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:59:28.545481 sudo[1960]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 16 17:59:28.844172 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Jan 16 17:59:28.862689 (dockerd)[1981]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 16 17:59:29.126455 dockerd[1981]: time="2026-01-16T17:59:29.124634338Z" level=info msg="Starting up" Jan 16 17:59:29.126455 dockerd[1981]: time="2026-01-16T17:59:29.125351686Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 16 17:59:29.134901 dockerd[1981]: time="2026-01-16T17:59:29.134820222Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 16 17:59:29.178648 dockerd[1981]: time="2026-01-16T17:59:29.178470042Z" level=info msg="Loading containers: start." Jan 16 17:59:29.191458 kernel: Initializing XFRM netlink socket Jan 16 17:59:29.236000 audit[2031]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2031 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.236000 audit[2031]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffefe45f00 a2=0 a3=0 items=0 ppid=1981 pid=2031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.236000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 16 17:59:29.237000 audit[2033]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2033 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.237000 audit[2033]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffd5e6b0c0 a2=0 a3=0 items=0 ppid=1981 pid=2033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
17:59:29.237000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 16 17:59:29.238000 audit[2035]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2035 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.238000 audit[2035]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd73cdf50 a2=0 a3=0 items=0 ppid=1981 pid=2035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.238000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 16 17:59:29.240000 audit[2037]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2037 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.240000 audit[2037]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff65168a0 a2=0 a3=0 items=0 ppid=1981 pid=2037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.240000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 16 17:59:29.241000 audit[2039]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2039 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.241000 audit[2039]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffcf8d4870 a2=0 a3=0 items=0 ppid=1981 pid=2039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.241000 audit: 
PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 16 17:59:29.244000 audit[2041]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2041 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.244000 audit[2041]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=fffffe42f500 a2=0 a3=0 items=0 ppid=1981 pid=2041 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.244000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 16 17:59:29.245000 audit[2043]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2043 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.245000 audit[2043]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffceba7e80 a2=0 a3=0 items=0 ppid=1981 pid=2043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.245000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 16 17:59:29.247000 audit[2045]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2045 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.247000 audit[2045]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=fffff947aa40 a2=0 a3=0 items=0 ppid=1981 pid=2045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
17:59:29.247000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 16 17:59:29.284000 audit[2048]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2048 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.284000 audit[2048]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=ffffde1c1000 a2=0 a3=0 items=0 ppid=1981 pid=2048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.284000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 16 17:59:29.286000 audit[2050]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2050 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.286000 audit[2050]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffd715c1c0 a2=0 a3=0 items=0 ppid=1981 pid=2050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.286000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 16 17:59:29.288000 audit[2052]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2052 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.288000 audit[2052]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=fffffccf0cb0 a2=0 a3=0 items=0 ppid=1981 pid=2052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.288000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 16 17:59:29.290000 audit[2054]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2054 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.290000 audit[2054]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffe99016e0 a2=0 a3=0 items=0 ppid=1981 pid=2054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.290000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 16 17:59:29.291000 audit[2056]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2056 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.291000 audit[2056]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffc9c49ee0 a2=0 a3=0 items=0 ppid=1981 pid=2056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.291000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 16 17:59:29.323000 audit[2086]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2086 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.323000 audit[2086]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=fffff7801910 a2=0 a3=0 items=0 ppid=1981 pid=2086 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.323000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 16 17:59:29.325000 audit[2088]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2088 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.325000 audit[2088]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffc70ffae0 a2=0 a3=0 items=0 ppid=1981 pid=2088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.325000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 16 17:59:29.325000 audit[2090]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2090 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.325000 audit[2090]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffa72f790 a2=0 a3=0 items=0 ppid=1981 pid=2090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.325000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 16 17:59:29.327000 audit[2092]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2092 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.327000 audit[2092]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc7836550 a2=0 a3=0 items=0 ppid=1981 pid=2092 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.327000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 16 17:59:29.329000 audit[2094]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2094 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.329000 audit[2094]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffcfa12f80 a2=0 a3=0 items=0 ppid=1981 pid=2094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.329000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 16 17:59:29.330000 audit[2096]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2096 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.330000 audit[2096]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffd01ef260 a2=0 a3=0 items=0 ppid=1981 pid=2096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.330000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 16 17:59:29.333000 audit[2098]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2098 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.333000 audit[2098]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc0d84aa0 a2=0 a3=0 items=0 ppid=1981 pid=2098 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.333000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 16 17:59:29.334000 audit[2100]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.334000 audit[2100]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffdd7dd250 a2=0 a3=0 items=0 ppid=1981 pid=2100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.334000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 16 17:59:29.336000 audit[2102]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2102 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.336000 audit[2102]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffd9262860 a2=0 a3=0 items=0 ppid=1981 pid=2102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.336000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 16 17:59:29.337000 audit[2104]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2104 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.337000 audit[2104]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffec975860 a2=0 a3=0 items=0 ppid=1981 pid=2104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.337000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 16 17:59:29.339000 audit[2106]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2106 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.339000 audit[2106]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=fffff5296760 a2=0 a3=0 items=0 ppid=1981 pid=2106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.339000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 16 17:59:29.340000 audit[2108]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2108 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.340000 audit[2108]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffc2046950 a2=0 a3=0 items=0 ppid=1981 pid=2108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.340000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 16 17:59:29.343000 audit[2110]: NETFILTER_CFG 
table=filter:27 family=10 entries=1 op=nft_register_rule pid=2110 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.343000 audit[2110]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=fffff52d1a50 a2=0 a3=0 items=0 ppid=1981 pid=2110 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.343000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 16 17:59:29.347000 audit[2115]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2115 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.347000 audit[2115]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc5c97740 a2=0 a3=0 items=0 ppid=1981 pid=2115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.347000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 16 17:59:29.347000 audit[2117]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2117 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.347000 audit[2117]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffc8daf920 a2=0 a3=0 items=0 ppid=1981 pid=2117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.347000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 16 17:59:29.349000 audit[2119]: NETFILTER_CFG 
table=filter:30 family=2 entries=1 op=nft_register_rule pid=2119 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.349000 audit[2119]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffc9934000 a2=0 a3=0 items=0 ppid=1981 pid=2119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.349000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 16 17:59:29.350000 audit[2121]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.350000 audit[2121]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffdc6ec220 a2=0 a3=0 items=0 ppid=1981 pid=2121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.350000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 16 17:59:29.352000 audit[2123]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2123 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.352000 audit[2123]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffd1b95280 a2=0 a3=0 items=0 ppid=1981 pid=2123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.352000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 16 17:59:29.354000 audit[2125]: NETFILTER_CFG table=filter:33 
family=10 entries=1 op=nft_register_rule pid=2125 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:29.354000 audit[2125]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffed29d480 a2=0 a3=0 items=0 ppid=1981 pid=2125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.354000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 16 17:59:29.377000 audit[2132]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2132 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.377000 audit[2132]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=fffffdc94350 a2=0 a3=0 items=0 ppid=1981 pid=2132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.377000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 16 17:59:29.379000 audit[2134]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2134 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.379000 audit[2134]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffd6cefbf0 a2=0 a3=0 items=0 ppid=1981 pid=2134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.379000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 16 17:59:29.384000 audit[2142]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2142 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.384000 audit[2142]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=fffff854e0b0 a2=0 a3=0 items=0 ppid=1981 pid=2142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.384000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 16 17:59:29.394000 audit[2148]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2148 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.394000 audit[2148]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffd31ebcc0 a2=0 a3=0 items=0 ppid=1981 pid=2148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.394000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 16 17:59:29.396000 audit[2150]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2150 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.396000 audit[2150]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=ffffe80b5420 a2=0 a3=0 items=0 ppid=1981 pid=2150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.396000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 16 17:59:29.398000 audit[2152]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2152 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.398000 audit[2152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=fffff911d760 a2=0 a3=0 items=0 ppid=1981 pid=2152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.398000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 16 17:59:29.400000 audit[2154]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2154 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:29.400000 audit[2154]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=ffffc2da6f40 a2=0 a3=0 items=0 ppid=1981 pid=2154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.400000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 16 17:59:29.401000 audit[2156]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2156 subj=system_u:system_r:kernel_t:s0 
comm="iptables" Jan 16 17:59:29.401000 audit[2156]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=fffff2684630 a2=0 a3=0 items=0 ppid=1981 pid=2156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:29.401000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 16 17:59:29.402699 systemd-networkd[1574]: docker0: Link UP Jan 16 17:59:29.407775 dockerd[1981]: time="2026-01-16T17:59:29.407706305Z" level=info msg="Loading containers: done." Jan 16 17:59:29.417872 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1841094013-merged.mount: Deactivated successfully. Jan 16 17:59:29.427667 dockerd[1981]: time="2026-01-16T17:59:29.427579772Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 16 17:59:29.427837 dockerd[1981]: time="2026-01-16T17:59:29.427819829Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 16 17:59:29.428119 dockerd[1981]: time="2026-01-16T17:59:29.428047163Z" level=info msg="Initializing buildkit" Jan 16 17:59:29.448763 dockerd[1981]: time="2026-01-16T17:59:29.448738556Z" level=info msg="Completed buildkit initialization" Jan 16 17:59:29.455728 dockerd[1981]: time="2026-01-16T17:59:29.455690871Z" level=info msg="Daemon has completed initialization" Jan 16 17:59:29.455954 systemd[1]: Started docker.service - Docker Application Container Engine. 
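An aside on reading the audit records above: each `proctitle=` field is the invoked command line, hex-encoded with NUL bytes separating the arguments. A minimal Python sketch (the helper name `decode_proctitle` is ours, not part of any audit tooling) recovers the original iptables/ip6tables invocation:

```python
def decode_proctitle(hexstr):
    """Decode an audit PROCTITLE hex string into its argv list.

    The kernel logs the raw command line as hex-encoded bytes, with
    NUL (0x00) separators between arguments; consecutive NULs (seen
    in some records above as "000000") decode to empty arguments.
    """
    return bytes.fromhex(hexstr).split(b"\x00")

# First PROCTITLE record in this section, copied verbatim from the log:
argv = decode_proctitle(
    "2F7573722F62696E2F6970367461626C6573002D2D77616974"
    "002D740066696C746572002D4E00444F434B45522D49534F4C"
    "4154494F4E2D53544147452D32"
)
print([a.decode() for a in argv])
# → ['/usr/bin/ip6tables', '--wait', '-t', 'filter', '-N', 'DOCKER-ISOLATION-STAGE-2']
```

Decoding the remaining records the same way shows dockerd building its usual chains (DOCKER, DOCKER-USER, DOCKER-FORWARD, DOCKER-ISOLATION-STAGE-1/2) for both `family=2` (IPv4) and `family=10` (IPv6).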
Jan 16 17:59:29.455000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:29.456225 dockerd[1981]: time="2026-01-16T17:59:29.455816858Z" level=info msg="API listen on /run/docker.sock" Jan 16 17:59:30.673350 containerd[1658]: time="2026-01-16T17:59:30.673290581Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\"" Jan 16 17:59:31.483623 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2459459190.mount: Deactivated successfully. Jan 16 17:59:32.094991 containerd[1658]: time="2026-01-16T17:59:32.094463877Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:32.095588 containerd[1658]: time="2026-01-16T17:59:32.095535960Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.11: active requests=0, bytes read=24845792" Jan 16 17:59:32.097984 containerd[1658]: time="2026-01-16T17:59:32.097955648Z" level=info msg="ImageCreate event name:\"sha256:58951ea1a0b5de44646ea292c94b9350f33f22d147fccfd84bdc405eaabc442c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:32.101625 containerd[1658]: time="2026-01-16T17:59:32.101587539Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:32.102649 containerd[1658]: time="2026-01-16T17:59:32.102624502Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.11\" with image id \"sha256:58951ea1a0b5de44646ea292c94b9350f33f22d147fccfd84bdc405eaabc442c\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.11\", repo digest 
\"registry.k8s.io/kube-apiserver@sha256:41eaecaed9af0ca8ab36d7794819c7df199e68c6c6ee0649114d713c495f8bd5\", size \"26438581\" in 1.42929228s" Jan 16 17:59:32.102748 containerd[1658]: time="2026-01-16T17:59:32.102732662Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.11\" returns image reference \"sha256:58951ea1a0b5de44646ea292c94b9350f33f22d147fccfd84bdc405eaabc442c\"" Jan 16 17:59:32.103614 containerd[1658]: time="2026-01-16T17:59:32.103568905Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\"" Jan 16 17:59:33.750651 containerd[1658]: time="2026-01-16T17:59:33.750604169Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:33.751926 containerd[1658]: time="2026-01-16T17:59:33.751846173Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.11: active requests=0, bytes read=22613932" Jan 16 17:59:33.753388 containerd[1658]: time="2026-01-16T17:59:33.753343858Z" level=info msg="ImageCreate event name:\"sha256:82766e5f2d560b930b7069c03ec1366dc8fdb4a490c3005266d2fdc4ca21c2fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:33.756575 containerd[1658]: time="2026-01-16T17:59:33.756541267Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:33.758104 containerd[1658]: time="2026-01-16T17:59:33.758002712Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.11\" with image id \"sha256:82766e5f2d560b930b7069c03ec1366dc8fdb4a490c3005266d2fdc4ca21c2fc\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:ce7b2ead5eef1a1554ef28b2b79596c6a8c6d506a87a7ab1381e77fe3d72f55f\", size \"24206567\" in 
1.654373607s" Jan 16 17:59:33.758104 containerd[1658]: time="2026-01-16T17:59:33.758029232Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.11\" returns image reference \"sha256:82766e5f2d560b930b7069c03ec1366dc8fdb4a490c3005266d2fdc4ca21c2fc\"" Jan 16 17:59:33.758515 containerd[1658]: time="2026-01-16T17:59:33.758494353Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\"" Jan 16 17:59:35.146867 containerd[1658]: time="2026-01-16T17:59:35.146776284Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:35.147934 containerd[1658]: time="2026-01-16T17:59:35.147877928Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.11: active requests=0, bytes read=17608611" Jan 16 17:59:35.149553 containerd[1658]: time="2026-01-16T17:59:35.149520413Z" level=info msg="ImageCreate event name:\"sha256:cfa17ff3d66343f03eadbc235264b0615de49cc1f43da12cddba27d80c61f2c6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:35.154034 containerd[1658]: time="2026-01-16T17:59:35.153077264Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:35.154128 containerd[1658]: time="2026-01-16T17:59:35.153976106Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.11\" with image id \"sha256:cfa17ff3d66343f03eadbc235264b0615de49cc1f43da12cddba27d80c61f2c6\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b3039587bbe70e61a6aeaff56c21fdeeef104524a31f835bcc80887d40b8e6b2\", size \"19201246\" in 1.395452513s" Jan 16 17:59:35.154128 containerd[1658]: time="2026-01-16T17:59:35.154121267Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.11\" returns image 
reference \"sha256:cfa17ff3d66343f03eadbc235264b0615de49cc1f43da12cddba27d80c61f2c6\"" Jan 16 17:59:35.154582 containerd[1658]: time="2026-01-16T17:59:35.154519188Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\"" Jan 16 17:59:35.845017 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 16 17:59:35.846218 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:59:35.986330 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:59:35.986000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:35.989587 kernel: kauditd_printk_skb: 132 callbacks suppressed Jan 16 17:59:35.989657 kernel: audit: type=1130 audit(1768586375.986:294): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:36.009782 (kubelet)[2277]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 17:59:36.054022 kubelet[2277]: E0116 17:59:36.053927 2277 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 17:59:36.056000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Jan 16 17:59:36.056231 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 17:59:36.056358 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 16 17:59:36.056734 systemd[1]: kubelet.service: Consumed 142ms CPU time, 109.7M memory peak. Jan 16 17:59:36.061451 kernel: audit: type=1131 audit(1768586376.056:295): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 16 17:59:36.161770 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4180728834.mount: Deactivated successfully. Jan 16 17:59:36.383396 containerd[1658]: time="2026-01-16T17:59:36.382919021Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:36.383870 containerd[1658]: time="2026-01-16T17:59:36.383829744Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.11: active requests=0, bytes read=17713718" Jan 16 17:59:36.384763 containerd[1658]: time="2026-01-16T17:59:36.384737426Z" level=info msg="ImageCreate event name:\"sha256:dcdb790dc2bfe6e0b86f702c7f336a38eaef34f6370eb6ff68f4e5b03ed4d425\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:36.387075 containerd[1658]: time="2026-01-16T17:59:36.387045913Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:36.387998 containerd[1658]: time="2026-01-16T17:59:36.387964236Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.11\" with image id \"sha256:dcdb790dc2bfe6e0b86f702c7f336a38eaef34f6370eb6ff68f4e5b03ed4d425\", repo tag \"registry.k8s.io/kube-proxy:v1.32.11\", repo digest 
\"registry.k8s.io/kube-proxy@sha256:4204f9136c23a867929d32046032fe069b49ad94cf168042405e7d0ec88bdba9\", size \"27557743\" in 1.233413408s" Jan 16 17:59:36.387998 containerd[1658]: time="2026-01-16T17:59:36.387996436Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.11\" returns image reference \"sha256:dcdb790dc2bfe6e0b86f702c7f336a38eaef34f6370eb6ff68f4e5b03ed4d425\"" Jan 16 17:59:36.388566 containerd[1658]: time="2026-01-16T17:59:36.388524998Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jan 16 17:59:36.944042 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1144617323.mount: Deactivated successfully. Jan 16 17:59:37.463317 containerd[1658]: time="2026-01-16T17:59:37.463270278Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:37.464877 containerd[1658]: time="2026-01-16T17:59:37.464832682Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=15956282" Jan 16 17:59:37.466047 containerd[1658]: time="2026-01-16T17:59:37.466005126Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:37.469156 containerd[1658]: time="2026-01-16T17:59:37.469114055Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:37.470869 containerd[1658]: time="2026-01-16T17:59:37.470810301Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest 
\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.082229422s" Jan 16 17:59:37.470869 containerd[1658]: time="2026-01-16T17:59:37.470839021Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Jan 16 17:59:37.471564 containerd[1658]: time="2026-01-16T17:59:37.471500983Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 16 17:59:37.991646 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4127735278.mount: Deactivated successfully. Jan 16 17:59:37.998197 containerd[1658]: time="2026-01-16T17:59:37.998142150Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 16 17:59:37.999644 containerd[1658]: time="2026-01-16T17:59:37.999599514Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 16 17:59:38.000777 containerd[1658]: time="2026-01-16T17:59:38.000728958Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 16 17:59:38.003587 containerd[1658]: time="2026-01-16T17:59:38.003537486Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 16 17:59:38.004476 containerd[1658]: time="2026-01-16T17:59:38.004406449Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag 
\"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 532.853866ms" Jan 16 17:59:38.004476 containerd[1658]: time="2026-01-16T17:59:38.004461529Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Jan 16 17:59:38.005266 containerd[1658]: time="2026-01-16T17:59:38.005242971Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jan 16 17:59:38.589903 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2634856128.mount: Deactivated successfully. Jan 16 17:59:40.446969 containerd[1658]: time="2026-01-16T17:59:40.446870022Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:40.448250 containerd[1658]: time="2026-01-16T17:59:40.448042066Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=56456774" Jan 16 17:59:40.449178 containerd[1658]: time="2026-01-16T17:59:40.449128109Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:40.452716 containerd[1658]: time="2026-01-16T17:59:40.452672440Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:59:40.454372 containerd[1658]: time="2026-01-16T17:59:40.454334205Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size 
\"67941650\" in 2.449063754s" Jan 16 17:59:40.454412 containerd[1658]: time="2026-01-16T17:59:40.454370845Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" Jan 16 17:59:46.147959 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 16 17:59:46.151649 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:59:46.301149 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:59:46.300000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:46.304490 kernel: audit: type=1130 audit(1768586386.300:296): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:46.305267 (kubelet)[2434]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 17:59:46.336052 kubelet[2434]: E0116 17:59:46.335952 2434 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 17:59:46.338000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Jan 16 17:59:46.338713 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 17:59:46.338836 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 16 17:59:46.339185 systemd[1]: kubelet.service: Consumed 133ms CPU time, 105.3M memory peak. Jan 16 17:59:46.342457 kernel: audit: type=1131 audit(1768586386.338:297): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 16 17:59:47.089871 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:59:47.089000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:47.090173 systemd[1]: kubelet.service: Consumed 133ms CPU time, 105.3M memory peak. Jan 16 17:59:47.089000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:47.096820 kernel: audit: type=1130 audit(1768586387.089:298): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:47.096881 kernel: audit: type=1131 audit(1768586387.089:299): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:47.097117 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:59:47.117267 systemd[1]: Reload requested from client PID 2449 ('systemctl') (unit session-10.scope)... Jan 16 17:59:47.117284 systemd[1]: Reloading... 
Jan 16 17:59:47.206462 zram_generator::config[2495]: No configuration found. Jan 16 17:59:47.370886 systemd[1]: Reloading finished in 253 ms. Jan 16 17:59:47.386446 kernel: audit: type=1334 audit(1768586387.384:300): prog-id=63 op=LOAD Jan 16 17:59:47.386537 kernel: audit: type=1334 audit(1768586387.385:301): prog-id=43 op=UNLOAD Jan 16 17:59:47.386557 kernel: audit: type=1334 audit(1768586387.385:302): prog-id=64 op=LOAD Jan 16 17:59:47.384000 audit: BPF prog-id=63 op=LOAD Jan 16 17:59:47.385000 audit: BPF prog-id=43 op=UNLOAD Jan 16 17:59:47.385000 audit: BPF prog-id=64 op=LOAD Jan 16 17:59:47.385000 audit: BPF prog-id=59 op=UNLOAD Jan 16 17:59:47.387957 kernel: audit: type=1334 audit(1768586387.385:303): prog-id=59 op=UNLOAD Jan 16 17:59:47.388010 kernel: audit: type=1334 audit(1768586387.386:304): prog-id=65 op=LOAD Jan 16 17:59:47.388028 kernel: audit: type=1334 audit(1768586387.387:305): prog-id=66 op=LOAD Jan 16 17:59:47.386000 audit: BPF prog-id=65 op=LOAD Jan 16 17:59:47.387000 audit: BPF prog-id=66 op=LOAD Jan 16 17:59:47.387000 audit: BPF prog-id=53 op=UNLOAD Jan 16 17:59:47.387000 audit: BPF prog-id=54 op=UNLOAD Jan 16 17:59:47.388000 audit: BPF prog-id=67 op=LOAD Jan 16 17:59:47.388000 audit: BPF prog-id=55 op=UNLOAD Jan 16 17:59:47.389000 audit: BPF prog-id=68 op=LOAD Jan 16 17:59:47.389000 audit: BPF prog-id=69 op=LOAD Jan 16 17:59:47.389000 audit: BPF prog-id=56 op=UNLOAD Jan 16 17:59:47.389000 audit: BPF prog-id=57 op=UNLOAD Jan 16 17:59:47.389000 audit: BPF prog-id=70 op=LOAD Jan 16 17:59:47.389000 audit: BPF prog-id=44 op=UNLOAD Jan 16 17:59:47.389000 audit: BPF prog-id=71 op=LOAD Jan 16 17:59:47.389000 audit: BPF prog-id=72 op=LOAD Jan 16 17:59:47.389000 audit: BPF prog-id=45 op=UNLOAD Jan 16 17:59:47.389000 audit: BPF prog-id=46 op=UNLOAD Jan 16 17:59:47.390000 audit: BPF prog-id=73 op=LOAD Jan 16 17:59:47.403000 audit: BPF prog-id=50 op=UNLOAD Jan 16 17:59:47.403000 audit: BPF prog-id=74 op=LOAD Jan 16 17:59:47.403000 audit: BPF prog-id=75 
op=LOAD Jan 16 17:59:47.403000 audit: BPF prog-id=51 op=UNLOAD Jan 16 17:59:47.403000 audit: BPF prog-id=52 op=UNLOAD Jan 16 17:59:47.404000 audit: BPF prog-id=76 op=LOAD Jan 16 17:59:47.404000 audit: BPF prog-id=47 op=UNLOAD Jan 16 17:59:47.404000 audit: BPF prog-id=77 op=LOAD Jan 16 17:59:47.404000 audit: BPF prog-id=78 op=LOAD Jan 16 17:59:47.404000 audit: BPF prog-id=48 op=UNLOAD Jan 16 17:59:47.404000 audit: BPF prog-id=49 op=UNLOAD Jan 16 17:59:47.405000 audit: BPF prog-id=79 op=LOAD Jan 16 17:59:47.405000 audit: BPF prog-id=58 op=UNLOAD Jan 16 17:59:47.406000 audit: BPF prog-id=80 op=LOAD Jan 16 17:59:47.406000 audit: BPF prog-id=60 op=UNLOAD Jan 16 17:59:47.406000 audit: BPF prog-id=81 op=LOAD Jan 16 17:59:47.406000 audit: BPF prog-id=82 op=LOAD Jan 16 17:59:47.406000 audit: BPF prog-id=61 op=UNLOAD Jan 16 17:59:47.406000 audit: BPF prog-id=62 op=UNLOAD Jan 16 17:59:47.422223 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 16 17:59:47.422292 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 16 17:59:47.422726 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:59:47.422000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 16 17:59:47.422806 systemd[1]: kubelet.service: Consumed 91ms CPU time, 95.4M memory peak. Jan 16 17:59:47.424147 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:59:47.443816 update_engine[1645]: I20260116 17:59:47.443470 1645 update_attempter.cc:509] Updating boot flags... Jan 16 17:59:47.583402 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 16 17:59:47.583000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:47.588302 (kubelet)[2559]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 16 17:59:47.625600 kubelet[2559]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 16 17:59:47.625600 kubelet[2559]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 16 17:59:47.625600 kubelet[2559]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
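The three deprecation warnings above all point at the kubelet config file. A hypothetical sketch of the equivalent KubeletConfiguration stanza (the `volumePluginDir` path matches the Flexvolume directory the kubelet reports later in this log; the containerd socket path is an illustrative assumption, not taken from this host; `--pod-infra-container-image` has no config-file field, since the image garbage collector now reads the sandbox image from CRI):

```yaml
# Sketch only: field names per kubelet.config.k8s.io/v1beta1.
# containerRuntimeEndpoint value is assumed, not from this log.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///run/containerd/containerd.sock"
volumePluginDir: "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/"
```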
Jan 16 17:59:47.625901 kubelet[2559]: I0116 17:59:47.625583 2559 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 16 17:59:48.518471 kubelet[2559]: I0116 17:59:48.518432 2559 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 16 17:59:48.518471 kubelet[2559]: I0116 17:59:48.518464 2559 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 16 17:59:48.518744 kubelet[2559]: I0116 17:59:48.518713 2559 server.go:954] "Client rotation is on, will bootstrap in background" Jan 16 17:59:48.549396 kubelet[2559]: E0116 17:59:48.549359 2559 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.7.62:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.7.62:6443: connect: connection refused" logger="UnhandledError" Jan 16 17:59:48.551236 kubelet[2559]: I0116 17:59:48.551216 2559 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 16 17:59:48.558159 kubelet[2559]: I0116 17:59:48.558137 2559 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 16 17:59:48.560829 kubelet[2559]: I0116 17:59:48.560783 2559 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 16 17:59:48.562522 kubelet[2559]: I0116 17:59:48.562475 2559 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 16 17:59:48.562711 kubelet[2559]: I0116 17:59:48.562517 2559 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4580-0-0-p-7f6b5ebc40","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 16 17:59:48.562836 kubelet[2559]: I0116 17:59:48.562787 2559 topology_manager.go:138] "Creating topology manager 
with none policy" Jan 16 17:59:48.562836 kubelet[2559]: I0116 17:59:48.562796 2559 container_manager_linux.go:304] "Creating device plugin manager" Jan 16 17:59:48.563022 kubelet[2559]: I0116 17:59:48.562991 2559 state_mem.go:36] "Initialized new in-memory state store" Jan 16 17:59:48.567416 kubelet[2559]: I0116 17:59:48.567399 2559 kubelet.go:446] "Attempting to sync node with API server" Jan 16 17:59:48.567473 kubelet[2559]: I0116 17:59:48.567434 2559 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 16 17:59:48.567473 kubelet[2559]: I0116 17:59:48.567459 2559 kubelet.go:352] "Adding apiserver pod source" Jan 16 17:59:48.567473 kubelet[2559]: I0116 17:59:48.567468 2559 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 16 17:59:48.570501 kubelet[2559]: I0116 17:59:48.570478 2559 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 16 17:59:48.571162 kubelet[2559]: I0116 17:59:48.571109 2559 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 16 17:59:48.571256 kubelet[2559]: W0116 17:59:48.571239 2559 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
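The `HardEvictionThresholds` list in the NodeConfig dump above (memory.available 100Mi, nodefs.available 10%, nodefs.inodesFree 5%, imagefs.available 15%, imagefs.inodesFree 5%) corresponds to the kubelet's default hard eviction settings. A sketch of the same values as an explicit `evictionHard` stanza, in case they need to be tuned:

```yaml
# Sketch only: values transcribed from the NodeConfig dump in this log.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
evictionHard:
  memory.available: "100Mi"
  nodefs.available: "10%"
  nodefs.inodesFree: "5%"
  imagefs.available: "15%"
  imagefs.inodesFree: "5%"
```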
Jan 16 17:59:48.571451 kubelet[2559]: W0116 17:59:48.571381 2559 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.7.62:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.7.62:6443: connect: connection refused Jan 16 17:59:48.571542 kubelet[2559]: E0116 17:59:48.571522 2559 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.7.62:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.7.62:6443: connect: connection refused" logger="UnhandledError" Jan 16 17:59:48.572113 kubelet[2559]: I0116 17:59:48.572070 2559 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 16 17:59:48.572113 kubelet[2559]: W0116 17:59:48.572069 2559 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.7.62:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4580-0-0-p-7f6b5ebc40&limit=500&resourceVersion=0": dial tcp 10.0.7.62:6443: connect: connection refused Jan 16 17:59:48.572113 kubelet[2559]: I0116 17:59:48.572107 2559 server.go:1287] "Started kubelet" Jan 16 17:59:48.572113 kubelet[2559]: E0116 17:59:48.572112 2559 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.7.62:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4580-0-0-p-7f6b5ebc40&limit=500&resourceVersion=0\": dial tcp 10.0.7.62:6443: connect: connection refused" logger="UnhandledError" Jan 16 17:59:48.574458 kubelet[2559]: I0116 17:59:48.573532 2559 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 16 17:59:48.574458 kubelet[2559]: I0116 17:59:48.574052 2559 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 16 17:59:48.574458 
kubelet[2559]: I0116 17:59:48.574319 2559 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 16 17:59:48.574458 kubelet[2559]: I0116 17:59:48.574375 2559 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 16 17:59:48.575190 kubelet[2559]: I0116 17:59:48.575164 2559 server.go:479] "Adding debug handlers to kubelet server" Jan 16 17:59:48.578064 kubelet[2559]: I0116 17:59:48.577934 2559 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 16 17:59:48.578187 kubelet[2559]: E0116 17:59:48.577806 2559 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.7.62:6443/api/v1/namespaces/default/events\": dial tcp 10.0.7.62:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4580-0-0-p-7f6b5ebc40.188b47f3356e31d4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4580-0-0-p-7f6b5ebc40,UID:ci-4580-0-0-p-7f6b5ebc40,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4580-0-0-p-7f6b5ebc40,},FirstTimestamp:2026-01-16 17:59:48.572090836 +0000 UTC m=+0.980231333,LastTimestamp:2026-01-16 17:59:48.572090836 +0000 UTC m=+0.980231333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4580-0-0-p-7f6b5ebc40,}" Jan 16 17:59:48.578629 kubelet[2559]: E0116 17:59:48.578610 2559 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4580-0-0-p-7f6b5ebc40\" not found" Jan 16 17:59:48.578722 kubelet[2559]: I0116 17:59:48.578712 2559 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 16 17:59:48.578940 kubelet[2559]: I0116 17:59:48.578920 2559 desired_state_of_world_populator.go:150] 
"Desired state populator starts to run" Jan 16 17:59:48.579052 kubelet[2559]: I0116 17:59:48.579041 2559 reconciler.go:26] "Reconciler: start to sync state" Jan 16 17:59:48.579411 kubelet[2559]: W0116 17:59:48.579377 2559 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.7.62:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.7.62:6443: connect: connection refused Jan 16 17:59:48.579534 kubelet[2559]: E0116 17:59:48.579515 2559 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.7.62:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.7.62:6443: connect: connection refused" logger="UnhandledError" Jan 16 17:59:48.579748 kubelet[2559]: I0116 17:59:48.579730 2559 factory.go:221] Registration of the systemd container factory successfully Jan 16 17:59:48.579866 kubelet[2559]: I0116 17:59:48.579850 2559 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 16 17:59:48.579000 audit[2572]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2572 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:48.579000 audit[2572]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffd03ed1b0 a2=0 a3=0 items=0 ppid=2559 pid=2572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:48.579000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 16 17:59:48.580000 audit[2573]: NETFILTER_CFG table=filter:43 family=2 entries=1 
op=nft_register_chain pid=2573 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:48.580000 audit[2573]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffbc911b0 a2=0 a3=0 items=0 ppid=2559 pid=2573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:48.580000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 16 17:59:48.582752 kubelet[2559]: E0116 17:59:48.582674 2559 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.7.62:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-7f6b5ebc40?timeout=10s\": dial tcp 10.0.7.62:6443: connect: connection refused" interval="200ms" Jan 16 17:59:48.583589 kubelet[2559]: I0116 17:59:48.583551 2559 factory.go:221] Registration of the containerd container factory successfully Jan 16 17:59:48.584000 audit[2575]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2575 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:48.584000 audit[2575]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffe37d3e60 a2=0 a3=0 items=0 ppid=2559 pid=2575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:48.584000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 16 17:59:48.586000 audit[2577]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2577 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:48.586000 audit[2577]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 
a1=ffffd6580a90 a2=0 a3=0 items=0 ppid=2559 pid=2577 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:48.586000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 16 17:59:48.587670 kubelet[2559]: E0116 17:59:48.587603 2559 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 16 17:59:48.593283 kubelet[2559]: I0116 17:59:48.593260 2559 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 16 17:59:48.593283 kubelet[2559]: I0116 17:59:48.593276 2559 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 16 17:59:48.593400 kubelet[2559]: I0116 17:59:48.593313 2559 state_mem.go:36] "Initialized new in-memory state store" Jan 16 17:59:48.595000 audit[2582]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2582 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:48.595000 audit[2582]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=ffffdc394fa0 a2=0 a3=0 items=0 ppid=2559 pid=2582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:48.595000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 16 17:59:48.596506 kubelet[2559]: I0116 17:59:48.596210 2559 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Jan 16 17:59:48.596000 audit[2583]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2583 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:48.596000 audit[2583]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffdd149870 a2=0 a3=0 items=0 ppid=2559 pid=2583 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:48.596000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 16 17:59:48.596000 audit[2584]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2584 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:48.596000 audit[2584]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffff498f30 a2=0 a3=0 items=0 ppid=2559 pid=2584 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:48.596000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 16 17:59:48.597971 kubelet[2559]: I0116 17:59:48.597665 2559 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 16 17:59:48.597971 kubelet[2559]: I0116 17:59:48.597678 2559 policy_none.go:49] "None policy: Start" Jan 16 17:59:48.597971 kubelet[2559]: I0116 17:59:48.597687 2559 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 16 17:59:48.597971 kubelet[2559]: I0116 17:59:48.597696 2559 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 16 17:59:48.597971 kubelet[2559]: I0116 17:59:48.597705 2559 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 16 17:59:48.597971 kubelet[2559]: I0116 17:59:48.597713 2559 kubelet.go:2382] "Starting kubelet main sync loop" Jan 16 17:59:48.597971 kubelet[2559]: E0116 17:59:48.597748 2559 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 16 17:59:48.597971 kubelet[2559]: I0116 17:59:48.597707 2559 state_mem.go:35] "Initializing new in-memory state store" Jan 16 17:59:48.598000 audit[2586]: NETFILTER_CFG table=nat:49 family=2 entries=1 op=nft_register_chain pid=2586 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:48.598000 audit[2586]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffff5f0650 a2=0 a3=0 items=0 ppid=2559 pid=2586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:48.598000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 16 17:59:48.598000 audit[2585]: NETFILTER_CFG table=mangle:50 family=10 entries=1 op=nft_register_chain pid=2585 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:48.598000 audit[2585]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe9228660 a2=0 a3=0 
items=0 ppid=2559 pid=2585 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:48.598000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 16 17:59:48.599595 kubelet[2559]: W0116 17:59:48.599516 2559 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.7.62:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.7.62:6443: connect: connection refused Jan 16 17:59:48.599595 kubelet[2559]: E0116 17:59:48.599563 2559 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.7.62:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.7.62:6443: connect: connection refused" logger="UnhandledError" Jan 16 17:59:48.599000 audit[2588]: NETFILTER_CFG table=nat:51 family=10 entries=1 op=nft_register_chain pid=2588 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:48.599000 audit[2588]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffea02ad30 a2=0 a3=0 items=0 ppid=2559 pid=2588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:48.599000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 16 17:59:48.600000 audit[2587]: NETFILTER_CFG table=filter:52 family=2 entries=1 op=nft_register_chain pid=2587 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:59:48.600000 audit[2587]: SYSCALL arch=c00000b7 syscall=211 success=yes 
exit=104 a0=3 a1=ffffee27d160 a2=0 a3=0 items=0 ppid=2559 pid=2587 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:48.600000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 16 17:59:48.600000 audit[2589]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2589 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:59:48.600000 audit[2589]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe71b5900 a2=0 a3=0 items=0 ppid=2559 pid=2589 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:48.600000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 16 17:59:48.603804 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 16 17:59:48.617184 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 16 17:59:48.629186 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
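The `PROCTITLE` audit records above encode the process command line as hex with NUL-separated argv elements. A minimal helper (hypothetical name) to decode them back into a readable command, shown against one of the hex strings from this log:

```python
def decode_proctitle(hex_str: str) -> str:
    """Decode an audit PROCTITLE hex payload (NUL-separated argv) to a command line."""
    raw = bytes.fromhex(hex_str)
    return " ".join(part.decode() for part in raw.split(b"\x00"))

# Hex taken verbatim from a NETFILTER_CFG/PROCTITLE pair earlier in this log:
print(decode_proctitle(
    "69707461626C6573002D770035002D5700313030303030"
    "002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65"
))  # → iptables -w 5 -W 100000 -N KUBE-IPTABLES-HINT -t mangle
```

This confirms the netfilter entries are the kubelet creating its KUBE-* chains via `xtables-nft-multi`.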
Jan 16 17:59:48.630560 kubelet[2559]: I0116 17:59:48.630537 2559 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 16 17:59:48.630790 kubelet[2559]: I0116 17:59:48.630720 2559 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 16 17:59:48.630790 kubelet[2559]: I0116 17:59:48.630731 2559 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 16 17:59:48.631003 kubelet[2559]: I0116 17:59:48.630980 2559 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 16 17:59:48.632113 kubelet[2559]: E0116 17:59:48.632094 2559 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 16 17:59:48.632155 kubelet[2559]: E0116 17:59:48.632138 2559 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4580-0-0-p-7f6b5ebc40\" not found" Jan 16 17:59:48.706731 systemd[1]: Created slice kubepods-burstable-pod9d4bef785720e8fe9fa0ba95ba727392.slice - libcontainer container kubepods-burstable-pod9d4bef785720e8fe9fa0ba95ba727392.slice. Jan 16 17:59:48.728772 kubelet[2559]: E0116 17:59:48.728729 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-7f6b5ebc40\" not found" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.731222 systemd[1]: Created slice kubepods-burstable-podc1e3b217a329036b0919e1d721717bf4.slice - libcontainer container kubepods-burstable-podc1e3b217a329036b0919e1d721717bf4.slice. 
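The "Failed to ensure lease exists, will retry" errors in this log report `interval="200ms"` first and `interval="400ms"` shortly after, consistent with a doubling retry interval while the API server at 10.0.7.62:6443 is unreachable. A tiny sketch of that pattern (assumed behavior inferred from the two intervals, not taken from kubelet source):

```python
def backoff_intervals(initial_ms: int, factor: int, steps: int) -> list[int]:
    """Generate a geometric retry-interval sequence, e.g. 200ms doubling each retry."""
    out, cur = [], initial_ms
    for _ in range(steps):
        out.append(cur)
        cur *= factor
    return out

print(backoff_intervals(200, 2, 4))  # → [200, 400, 800, 1600]
```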
Jan 16 17:59:48.732054 kubelet[2559]: I0116 17:59:48.732035 2559 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.732816 kubelet[2559]: E0116 17:59:48.732772 2559 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.7.62:6443/api/v1/nodes\": dial tcp 10.0.7.62:6443: connect: connection refused" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.744070 kubelet[2559]: E0116 17:59:48.744014 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-7f6b5ebc40\" not found" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.746310 systemd[1]: Created slice kubepods-burstable-pode05f0c3cd193c08a433dbb0fa21854cf.slice - libcontainer container kubepods-burstable-pode05f0c3cd193c08a433dbb0fa21854cf.slice. Jan 16 17:59:48.747883 kubelet[2559]: E0116 17:59:48.747830 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-7f6b5ebc40\" not found" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.784020 kubelet[2559]: E0116 17:59:48.783842 2559 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.7.62:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-7f6b5ebc40?timeout=10s\": dial tcp 10.0.7.62:6443: connect: connection refused" interval="400ms" Jan 16 17:59:48.880219 kubelet[2559]: I0116 17:59:48.880149 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c1e3b217a329036b0919e1d721717bf4-k8s-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"c1e3b217a329036b0919e1d721717bf4\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.880219 kubelet[2559]: I0116 17:59:48.880189 2559 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c1e3b217a329036b0919e1d721717bf4-kubeconfig\") pod \"kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"c1e3b217a329036b0919e1d721717bf4\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.880219 kubelet[2559]: I0116 17:59:48.880207 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c1e3b217a329036b0919e1d721717bf4-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"c1e3b217a329036b0919e1d721717bf4\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.880219 kubelet[2559]: I0116 17:59:48.880227 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9d4bef785720e8fe9fa0ba95ba727392-k8s-certs\") pod \"kube-apiserver-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"9d4bef785720e8fe9fa0ba95ba727392\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.880398 kubelet[2559]: I0116 17:59:48.880251 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c1e3b217a329036b0919e1d721717bf4-ca-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"c1e3b217a329036b0919e1d721717bf4\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.880398 kubelet[2559]: I0116 17:59:48.880269 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c1e3b217a329036b0919e1d721717bf4-flexvolume-dir\") pod \"kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40\" (UID: 
\"c1e3b217a329036b0919e1d721717bf4\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.880398 kubelet[2559]: I0116 17:59:48.880284 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9d4bef785720e8fe9fa0ba95ba727392-ca-certs\") pod \"kube-apiserver-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"9d4bef785720e8fe9fa0ba95ba727392\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.880398 kubelet[2559]: I0116 17:59:48.880299 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9d4bef785720e8fe9fa0ba95ba727392-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"9d4bef785720e8fe9fa0ba95ba727392\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.880398 kubelet[2559]: I0116 17:59:48.880318 2559 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e05f0c3cd193c08a433dbb0fa21854cf-kubeconfig\") pod \"kube-scheduler-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"e05f0c3cd193c08a433dbb0fa21854cf\") " pod="kube-system/kube-scheduler-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.935295 kubelet[2559]: I0116 17:59:48.935274 2559 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:48.935588 kubelet[2559]: E0116 17:59:48.935565 2559 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.7.62:6443/api/v1/nodes\": dial tcp 10.0.7.62:6443: connect: connection refused" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:49.030412 containerd[1658]: time="2026-01-16T17:59:49.030370425Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4580-0-0-p-7f6b5ebc40,Uid:9d4bef785720e8fe9fa0ba95ba727392,Namespace:kube-system,Attempt:0,}" Jan 16 17:59:49.046205 containerd[1658]: time="2026-01-16T17:59:49.045949473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40,Uid:c1e3b217a329036b0919e1d721717bf4,Namespace:kube-system,Attempt:0,}" Jan 16 17:59:49.049198 containerd[1658]: time="2026-01-16T17:59:49.049156402Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4580-0-0-p-7f6b5ebc40,Uid:e05f0c3cd193c08a433dbb0fa21854cf,Namespace:kube-system,Attempt:0,}" Jan 16 17:59:49.053258 containerd[1658]: time="2026-01-16T17:59:49.053225055Z" level=info msg="connecting to shim 7c651315e54e3b11aea0edb5a38fc12c443f3d5269fc0248465045a29d7af9e0" address="unix:///run/containerd/s/438ada51f92ef4218aa0f9c235bfc57ef34d6490979b8f9323dbc34de2e48c1e" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:59:49.075565 containerd[1658]: time="2026-01-16T17:59:49.075392242Z" level=info msg="connecting to shim c753fbe6018164acdeec60a62a596185ee77b85ddc6e82e0db403e8c654a9197" address="unix:///run/containerd/s/dddb982b1fd99a2f0c8b7c7189895e05c0938d13a57a539f7eeac68957f5fd49" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:59:49.078639 systemd[1]: Started cri-containerd-7c651315e54e3b11aea0edb5a38fc12c443f3d5269fc0248465045a29d7af9e0.scope - libcontainer container 7c651315e54e3b11aea0edb5a38fc12c443f3d5269fc0248465045a29d7af9e0. 
Jan 16 17:59:49.091226 containerd[1658]: time="2026-01-16T17:59:49.091096730Z" level=info msg="connecting to shim 8bf801f1f330784ad07cd6cec66806776bcf15f176e992b41635d516a430ebf0" address="unix:///run/containerd/s/586f78604bf3c30db8b94bb7d3c443759db9b48c7b1fd97f938c0ab4a105d37b" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:59:49.103688 systemd[1]: Started cri-containerd-c753fbe6018164acdeec60a62a596185ee77b85ddc6e82e0db403e8c654a9197.scope - libcontainer container c753fbe6018164acdeec60a62a596185ee77b85ddc6e82e0db403e8c654a9197. Jan 16 17:59:49.105000 audit: BPF prog-id=83 op=LOAD Jan 16 17:59:49.105000 audit: BPF prog-id=84 op=LOAD Jan 16 17:59:49.105000 audit[2609]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2598 pid=2609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.105000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763363531333135653534653362313161656130656462356133386663 Jan 16 17:59:49.106000 audit: BPF prog-id=84 op=UNLOAD Jan 16 17:59:49.106000 audit[2609]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2598 pid=2609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.106000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763363531333135653534653362313161656130656462356133386663 Jan 16 17:59:49.107000 audit: BPF prog-id=85 
op=LOAD Jan 16 17:59:49.107000 audit[2609]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2598 pid=2609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.107000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763363531333135653534653362313161656130656462356133386663 Jan 16 17:59:49.107000 audit: BPF prog-id=86 op=LOAD Jan 16 17:59:49.107000 audit[2609]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2598 pid=2609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.107000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763363531333135653534653362313161656130656462356133386663 Jan 16 17:59:49.107000 audit: BPF prog-id=86 op=UNLOAD Jan 16 17:59:49.107000 audit[2609]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2598 pid=2609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.107000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763363531333135653534653362313161656130656462356133386663 Jan 
16 17:59:49.107000 audit: BPF prog-id=85 op=UNLOAD Jan 16 17:59:49.107000 audit[2609]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2598 pid=2609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.107000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763363531333135653534653362313161656130656462356133386663 Jan 16 17:59:49.107000 audit: BPF prog-id=87 op=LOAD Jan 16 17:59:49.107000 audit[2609]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2598 pid=2609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.107000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763363531333135653534653362313161656130656462356133386663 Jan 16 17:59:49.110152 systemd[1]: Started cri-containerd-8bf801f1f330784ad07cd6cec66806776bcf15f176e992b41635d516a430ebf0.scope - libcontainer container 8bf801f1f330784ad07cd6cec66806776bcf15f176e992b41635d516a430ebf0. 
Jan 16 17:59:49.118000 audit: BPF prog-id=88 op=LOAD Jan 16 17:59:49.118000 audit: BPF prog-id=89 op=LOAD Jan 16 17:59:49.118000 audit[2644]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2631 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.118000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337353366626536303138313634616364656563363061363261353936 Jan 16 17:59:49.118000 audit: BPF prog-id=89 op=UNLOAD Jan 16 17:59:49.118000 audit[2644]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2631 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.118000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337353366626536303138313634616364656563363061363261353936 Jan 16 17:59:49.118000 audit: BPF prog-id=90 op=LOAD Jan 16 17:59:49.118000 audit[2644]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2631 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.118000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337353366626536303138313634616364656563363061363261353936 Jan 16 17:59:49.118000 audit: BPF prog-id=91 op=LOAD Jan 16 17:59:49.118000 audit[2644]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2631 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.118000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337353366626536303138313634616364656563363061363261353936 Jan 16 17:59:49.119000 audit: BPF prog-id=91 op=UNLOAD Jan 16 17:59:49.119000 audit[2644]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2631 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.119000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337353366626536303138313634616364656563363061363261353936 Jan 16 17:59:49.119000 audit: BPF prog-id=90 op=UNLOAD Jan 16 17:59:49.119000 audit[2644]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2631 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
17:59:49.119000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337353366626536303138313634616364656563363061363261353936 Jan 16 17:59:49.119000 audit: BPF prog-id=92 op=LOAD Jan 16 17:59:49.119000 audit[2644]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2631 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.119000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337353366626536303138313634616364656563363061363261353936 Jan 16 17:59:49.122000 audit: BPF prog-id=93 op=LOAD Jan 16 17:59:49.123000 audit: BPF prog-id=94 op=LOAD Jan 16 17:59:49.123000 audit[2680]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2660 pid=2680 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.123000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862663830316631663333303738346164303763643663656336363830 Jan 16 17:59:49.123000 audit: BPF prog-id=94 op=UNLOAD Jan 16 17:59:49.123000 audit[2680]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2660 pid=2680 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.123000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862663830316631663333303738346164303763643663656336363830 Jan 16 17:59:49.123000 audit: BPF prog-id=95 op=LOAD Jan 16 17:59:49.123000 audit[2680]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2660 pid=2680 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.123000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862663830316631663333303738346164303763643663656336363830 Jan 16 17:59:49.123000 audit: BPF prog-id=96 op=LOAD Jan 16 17:59:49.123000 audit[2680]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2660 pid=2680 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.123000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862663830316631663333303738346164303763643663656336363830 Jan 16 17:59:49.123000 audit: BPF prog-id=96 op=UNLOAD Jan 16 17:59:49.123000 audit[2680]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2660 pid=2680 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.123000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862663830316631663333303738346164303763643663656336363830 Jan 16 17:59:49.123000 audit: BPF prog-id=95 op=UNLOAD Jan 16 17:59:49.123000 audit[2680]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2660 pid=2680 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.123000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862663830316631663333303738346164303763643663656336363830 Jan 16 17:59:49.123000 audit: BPF prog-id=97 op=LOAD Jan 16 17:59:49.123000 audit[2680]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2660 pid=2680 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.123000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862663830316631663333303738346164303763643663656336363830 Jan 16 17:59:49.143011 containerd[1658]: time="2026-01-16T17:59:49.142957607Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4580-0-0-p-7f6b5ebc40,Uid:9d4bef785720e8fe9fa0ba95ba727392,Namespace:kube-system,Attempt:0,} returns sandbox id \"7c651315e54e3b11aea0edb5a38fc12c443f3d5269fc0248465045a29d7af9e0\"" Jan 16 17:59:49.146379 containerd[1658]: time="2026-01-16T17:59:49.146349977Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40,Uid:c1e3b217a329036b0919e1d721717bf4,Namespace:kube-system,Attempt:0,} returns sandbox id \"c753fbe6018164acdeec60a62a596185ee77b85ddc6e82e0db403e8c654a9197\"" Jan 16 17:59:49.146517 containerd[1658]: time="2026-01-16T17:59:49.146486018Z" level=info msg="CreateContainer within sandbox \"7c651315e54e3b11aea0edb5a38fc12c443f3d5269fc0248465045a29d7af9e0\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 16 17:59:49.149127 containerd[1658]: time="2026-01-16T17:59:49.148812585Z" level=info msg="CreateContainer within sandbox \"c753fbe6018164acdeec60a62a596185ee77b85ddc6e82e0db403e8c654a9197\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 16 17:59:49.156768 containerd[1658]: time="2026-01-16T17:59:49.156737649Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4580-0-0-p-7f6b5ebc40,Uid:e05f0c3cd193c08a433dbb0fa21854cf,Namespace:kube-system,Attempt:0,} returns sandbox id \"8bf801f1f330784ad07cd6cec66806776bcf15f176e992b41635d516a430ebf0\"" Jan 16 17:59:49.159832 containerd[1658]: time="2026-01-16T17:59:49.159801418Z" level=info msg="CreateContainer within sandbox \"8bf801f1f330784ad07cd6cec66806776bcf15f176e992b41635d516a430ebf0\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 16 17:59:49.162883 containerd[1658]: time="2026-01-16T17:59:49.162849787Z" level=info msg="Container ef712cc50b0e2fabd67e8e5aff56c5d775115d00a2a217aa1382163b2808a919: CDI devices from CRI Config.CDIDevices: []" Jan 16 17:59:49.167174 containerd[1658]: time="2026-01-16T17:59:49.167141760Z" 
level=info msg="Container 9faa95818482f58af12cff6b2566c18653b9fe7925f9e1535b63a59d8fa9c546: CDI devices from CRI Config.CDIDevices: []" Jan 16 17:59:49.174575 containerd[1658]: time="2026-01-16T17:59:49.174542743Z" level=info msg="CreateContainer within sandbox \"7c651315e54e3b11aea0edb5a38fc12c443f3d5269fc0248465045a29d7af9e0\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"ef712cc50b0e2fabd67e8e5aff56c5d775115d00a2a217aa1382163b2808a919\"" Jan 16 17:59:49.176095 containerd[1658]: time="2026-01-16T17:59:49.175770106Z" level=info msg="StartContainer for \"ef712cc50b0e2fabd67e8e5aff56c5d775115d00a2a217aa1382163b2808a919\"" Jan 16 17:59:49.176280 containerd[1658]: time="2026-01-16T17:59:49.176257348Z" level=info msg="Container 9bf6682c38eeafe1f91e0d8d24ff6f42acc1ff588bf5c6ddda9ebaabc6b16041: CDI devices from CRI Config.CDIDevices: []" Jan 16 17:59:49.177115 containerd[1658]: time="2026-01-16T17:59:49.177093270Z" level=info msg="connecting to shim ef712cc50b0e2fabd67e8e5aff56c5d775115d00a2a217aa1382163b2808a919" address="unix:///run/containerd/s/438ada51f92ef4218aa0f9c235bfc57ef34d6490979b8f9323dbc34de2e48c1e" protocol=ttrpc version=3 Jan 16 17:59:49.182615 containerd[1658]: time="2026-01-16T17:59:49.182571967Z" level=info msg="CreateContainer within sandbox \"c753fbe6018164acdeec60a62a596185ee77b85ddc6e82e0db403e8c654a9197\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"9faa95818482f58af12cff6b2566c18653b9fe7925f9e1535b63a59d8fa9c546\"" Jan 16 17:59:49.183002 containerd[1658]: time="2026-01-16T17:59:49.182979528Z" level=info msg="StartContainer for \"9faa95818482f58af12cff6b2566c18653b9fe7925f9e1535b63a59d8fa9c546\"" Jan 16 17:59:49.184853 kubelet[2559]: E0116 17:59:49.184804 2559 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.7.62:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-7f6b5ebc40?timeout=10s\": dial tcp 
10.0.7.62:6443: connect: connection refused" interval="800ms" Jan 16 17:59:49.185029 containerd[1658]: time="2026-01-16T17:59:49.185000254Z" level=info msg="connecting to shim 9faa95818482f58af12cff6b2566c18653b9fe7925f9e1535b63a59d8fa9c546" address="unix:///run/containerd/s/dddb982b1fd99a2f0c8b7c7189895e05c0938d13a57a539f7eeac68957f5fd49" protocol=ttrpc version=3 Jan 16 17:59:49.185182 containerd[1658]: time="2026-01-16T17:59:49.185102975Z" level=info msg="CreateContainer within sandbox \"8bf801f1f330784ad07cd6cec66806776bcf15f176e992b41635d516a430ebf0\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"9bf6682c38eeafe1f91e0d8d24ff6f42acc1ff588bf5c6ddda9ebaabc6b16041\"" Jan 16 17:59:49.185578 containerd[1658]: time="2026-01-16T17:59:49.185551936Z" level=info msg="StartContainer for \"9bf6682c38eeafe1f91e0d8d24ff6f42acc1ff588bf5c6ddda9ebaabc6b16041\"" Jan 16 17:59:49.187749 containerd[1658]: time="2026-01-16T17:59:49.187713063Z" level=info msg="connecting to shim 9bf6682c38eeafe1f91e0d8d24ff6f42acc1ff588bf5c6ddda9ebaabc6b16041" address="unix:///run/containerd/s/586f78604bf3c30db8b94bb7d3c443759db9b48c7b1fd97f938c0ab4a105d37b" protocol=ttrpc version=3 Jan 16 17:59:49.196613 systemd[1]: Started cri-containerd-ef712cc50b0e2fabd67e8e5aff56c5d775115d00a2a217aa1382163b2808a919.scope - libcontainer container ef712cc50b0e2fabd67e8e5aff56c5d775115d00a2a217aa1382163b2808a919. Jan 16 17:59:49.210815 systemd[1]: Started cri-containerd-9faa95818482f58af12cff6b2566c18653b9fe7925f9e1535b63a59d8fa9c546.scope - libcontainer container 9faa95818482f58af12cff6b2566c18653b9fe7925f9e1535b63a59d8fa9c546. Jan 16 17:59:49.213736 systemd[1]: Started cri-containerd-9bf6682c38eeafe1f91e0d8d24ff6f42acc1ff588bf5c6ddda9ebaabc6b16041.scope - libcontainer container 9bf6682c38eeafe1f91e0d8d24ff6f42acc1ff588bf5c6ddda9ebaabc6b16041. 
Jan 16 17:59:49.216000 audit: BPF prog-id=98 op=LOAD Jan 16 17:59:49.216000 audit: BPF prog-id=99 op=LOAD Jan 16 17:59:49.216000 audit[2730]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2598 pid=2730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.216000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6566373132636335306230653266616264363765386535616666353663 Jan 16 17:59:49.216000 audit: BPF prog-id=99 op=UNLOAD Jan 16 17:59:49.216000 audit[2730]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2598 pid=2730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.216000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6566373132636335306230653266616264363765386535616666353663 Jan 16 17:59:49.216000 audit: BPF prog-id=100 op=LOAD Jan 16 17:59:49.216000 audit[2730]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2598 pid=2730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.216000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6566373132636335306230653266616264363765386535616666353663 Jan 16 17:59:49.217000 audit: BPF prog-id=101 op=LOAD Jan 16 17:59:49.217000 audit[2730]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2598 pid=2730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.217000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6566373132636335306230653266616264363765386535616666353663 Jan 16 17:59:49.217000 audit: BPF prog-id=101 op=UNLOAD Jan 16 17:59:49.217000 audit[2730]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2598 pid=2730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.217000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6566373132636335306230653266616264363765386535616666353663 Jan 16 17:59:49.217000 audit: BPF prog-id=100 op=UNLOAD Jan 16 17:59:49.217000 audit[2730]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2598 pid=2730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
16 17:59:49.217000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6566373132636335306230653266616264363765386535616666353663 Jan 16 17:59:49.217000 audit: BPF prog-id=102 op=LOAD Jan 16 17:59:49.217000 audit[2730]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2598 pid=2730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.217000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6566373132636335306230653266616264363765386535616666353663 Jan 16 17:59:49.224000 audit: BPF prog-id=103 op=LOAD Jan 16 17:59:49.225000 audit: BPF prog-id=104 op=LOAD Jan 16 17:59:49.225000 audit[2742]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2631 pid=2742 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.225000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966616139353831383438326635386166313263666636623235363663 Jan 16 17:59:49.225000 audit: BPF prog-id=104 op=UNLOAD Jan 16 17:59:49.225000 audit[2742]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2631 pid=2742 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.225000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966616139353831383438326635386166313263666636623235363663 Jan 16 17:59:49.225000 audit: BPF prog-id=105 op=LOAD Jan 16 17:59:49.225000 audit[2742]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2631 pid=2742 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.225000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966616139353831383438326635386166313263666636623235363663 Jan 16 17:59:49.225000 audit: BPF prog-id=106 op=LOAD Jan 16 17:59:49.225000 audit[2742]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2631 pid=2742 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.225000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966616139353831383438326635386166313263666636623235363663 Jan 16 17:59:49.225000 audit: BPF prog-id=106 op=UNLOAD Jan 16 17:59:49.225000 audit[2742]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2631 pid=2742 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.225000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966616139353831383438326635386166313263666636623235363663 Jan 16 17:59:49.225000 audit: BPF prog-id=105 op=UNLOAD Jan 16 17:59:49.225000 audit[2742]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2631 pid=2742 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.225000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966616139353831383438326635386166313263666636623235363663 Jan 16 17:59:49.225000 audit: BPF prog-id=107 op=LOAD Jan 16 17:59:49.225000 audit[2742]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2631 pid=2742 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.225000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966616139353831383438326635386166313263666636623235363663 Jan 16 17:59:49.227000 audit: BPF prog-id=108 op=LOAD Jan 16 17:59:49.228000 audit: BPF prog-id=109 op=LOAD Jan 16 17:59:49.228000 audit[2743]: SYSCALL arch=c00000b7 syscall=280 success=yes 
exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2660 pid=2743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962663636383263333865656166653166393165306438643234666636 Jan 16 17:59:49.228000 audit: BPF prog-id=109 op=UNLOAD Jan 16 17:59:49.228000 audit[2743]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2660 pid=2743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962663636383263333865656166653166393165306438643234666636 Jan 16 17:59:49.228000 audit: BPF prog-id=110 op=LOAD Jan 16 17:59:49.228000 audit[2743]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2660 pid=2743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962663636383263333865656166653166393165306438643234666636 Jan 16 17:59:49.228000 audit: BPF prog-id=111 op=LOAD Jan 16 17:59:49.228000 audit[2743]: 
SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2660 pid=2743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962663636383263333865656166653166393165306438643234666636 Jan 16 17:59:49.228000 audit: BPF prog-id=111 op=UNLOAD Jan 16 17:59:49.228000 audit[2743]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2660 pid=2743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962663636383263333865656166653166393165306438643234666636 Jan 16 17:59:49.228000 audit: BPF prog-id=110 op=UNLOAD Jan 16 17:59:49.228000 audit[2743]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2660 pid=2743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962663636383263333865656166653166393165306438643234666636 Jan 16 17:59:49.228000 audit: BPF prog-id=112 op=LOAD 
Jan 16 17:59:49.228000 audit[2743]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2660 pid=2743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:49.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962663636383263333865656166653166393165306438643234666636 Jan 16 17:59:49.247271 containerd[1658]: time="2026-01-16T17:59:49.247228643Z" level=info msg="StartContainer for \"ef712cc50b0e2fabd67e8e5aff56c5d775115d00a2a217aa1382163b2808a919\" returns successfully" Jan 16 17:59:49.260733 containerd[1658]: time="2026-01-16T17:59:49.260701964Z" level=info msg="StartContainer for \"9faa95818482f58af12cff6b2566c18653b9fe7925f9e1535b63a59d8fa9c546\" returns successfully" Jan 16 17:59:49.263688 containerd[1658]: time="2026-01-16T17:59:49.263655693Z" level=info msg="StartContainer for \"9bf6682c38eeafe1f91e0d8d24ff6f42acc1ff588bf5c6ddda9ebaabc6b16041\" returns successfully" Jan 16 17:59:49.338147 kubelet[2559]: I0116 17:59:49.337946 2559 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:49.338521 kubelet[2559]: E0116 17:59:49.338486 2559 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.7.62:6443/api/v1/nodes\": dial tcp 10.0.7.62:6443: connect: connection refused" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:49.610143 kubelet[2559]: E0116 17:59:49.610055 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-7f6b5ebc40\" not found" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:49.612648 kubelet[2559]: E0116 17:59:49.612612 2559 kubelet.go:3190] 
"No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-7f6b5ebc40\" not found" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:49.614675 kubelet[2559]: E0116 17:59:49.614656 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-7f6b5ebc40\" not found" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:50.140973 kubelet[2559]: I0116 17:59:50.140901 2559 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:50.614884 kubelet[2559]: E0116 17:59:50.614855 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-7f6b5ebc40\" not found" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:50.614999 kubelet[2559]: E0116 17:59:50.614917 2559 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-7f6b5ebc40\" not found" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:50.824904 kubelet[2559]: E0116 17:59:50.824859 2559 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4580-0-0-p-7f6b5ebc40\" not found" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:50.906334 kubelet[2559]: I0116 17:59:50.906169 2559 kubelet_node_status.go:78] "Successfully registered node" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:50.906334 kubelet[2559]: E0116 17:59:50.906206 2559 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4580-0-0-p-7f6b5ebc40\": node \"ci-4580-0-0-p-7f6b5ebc40\" not found" Jan 16 17:59:50.981504 kubelet[2559]: I0116 17:59:50.981451 2559 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:50.987599 kubelet[2559]: E0116 17:59:50.987572 2559 kubelet.go:3196] "Failed creating a mirror pod" err="pods 
\"kube-apiserver-ci-4580-0-0-p-7f6b5ebc40\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:50.987599 kubelet[2559]: I0116 17:59:50.987598 2559 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:50.989069 kubelet[2559]: E0116 17:59:50.989039 2559 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:50.989069 kubelet[2559]: I0116 17:59:50.989061 2559 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:50.990432 kubelet[2559]: E0116 17:59:50.990392 2559 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4580-0-0-p-7f6b5ebc40\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:51.571775 kubelet[2559]: I0116 17:59:51.571653 2559 apiserver.go:52] "Watching apiserver" Jan 16 17:59:51.579297 kubelet[2559]: I0116 17:59:51.579254 2559 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 16 17:59:51.615236 kubelet[2559]: I0116 17:59:51.615210 2559 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:51.621983 kubelet[2559]: E0116 17:59:51.621942 2559 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4580-0-0-p-7f6b5ebc40\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.035385 systemd[1]: Reload requested from client PID 2830 ('systemctl') (unit 
session-10.scope)... Jan 16 17:59:53.035403 systemd[1]: Reloading... Jan 16 17:59:53.114520 zram_generator::config[2876]: No configuration found. Jan 16 17:59:53.295594 systemd[1]: Reloading finished in 259 ms. Jan 16 17:59:53.322335 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:59:53.331826 systemd[1]: kubelet.service: Deactivated successfully. Jan 16 17:59:53.333453 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:59:53.333000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:53.333524 systemd[1]: kubelet.service: Consumed 1.348s CPU time, 127.6M memory peak. Jan 16 17:59:53.334604 kernel: kauditd_printk_skb: 204 callbacks suppressed Jan 16 17:59:53.334673 kernel: audit: type=1131 audit(1768586393.333:402): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:53.335355 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 16 17:59:53.336000 audit: BPF prog-id=113 op=LOAD Jan 16 17:59:53.337769 kernel: audit: type=1334 audit(1768586393.336:403): prog-id=113 op=LOAD Jan 16 17:59:53.337809 kernel: audit: type=1334 audit(1768586393.336:404): prog-id=114 op=LOAD Jan 16 17:59:53.336000 audit: BPF prog-id=114 op=LOAD Jan 16 17:59:53.336000 audit: BPF prog-id=65 op=UNLOAD Jan 16 17:59:53.336000 audit: BPF prog-id=66 op=UNLOAD Jan 16 17:59:53.338000 audit: BPF prog-id=115 op=LOAD Jan 16 17:59:53.338000 audit: BPF prog-id=80 op=UNLOAD Jan 16 17:59:53.338000 audit: BPF prog-id=116 op=LOAD Jan 16 17:59:53.340150 kernel: audit: type=1334 audit(1768586393.336:405): prog-id=65 op=UNLOAD Jan 16 17:59:53.340179 kernel: audit: type=1334 audit(1768586393.336:406): prog-id=66 op=UNLOAD Jan 16 17:59:53.340200 kernel: audit: type=1334 audit(1768586393.338:407): prog-id=115 op=LOAD Jan 16 17:59:53.340216 kernel: audit: type=1334 audit(1768586393.338:408): prog-id=80 op=UNLOAD Jan 16 17:59:53.340232 kernel: audit: type=1334 audit(1768586393.338:409): prog-id=116 op=LOAD Jan 16 17:59:53.340000 audit: BPF prog-id=117 op=LOAD Jan 16 17:59:53.343061 kernel: audit: type=1334 audit(1768586393.340:410): prog-id=117 op=LOAD Jan 16 17:59:53.343091 kernel: audit: type=1334 audit(1768586393.340:411): prog-id=81 op=UNLOAD Jan 16 17:59:53.340000 audit: BPF prog-id=81 op=UNLOAD Jan 16 17:59:53.340000 audit: BPF prog-id=82 op=UNLOAD Jan 16 17:59:53.341000 audit: BPF prog-id=118 op=LOAD Jan 16 17:59:53.350000 audit: BPF prog-id=79 op=UNLOAD Jan 16 17:59:53.351000 audit: BPF prog-id=119 op=LOAD Jan 16 17:59:53.351000 audit: BPF prog-id=76 op=UNLOAD Jan 16 17:59:53.351000 audit: BPF prog-id=120 op=LOAD Jan 16 17:59:53.351000 audit: BPF prog-id=121 op=LOAD Jan 16 17:59:53.351000 audit: BPF prog-id=77 op=UNLOAD Jan 16 17:59:53.351000 audit: BPF prog-id=78 op=UNLOAD Jan 16 17:59:53.351000 audit: BPF prog-id=122 op=LOAD Jan 16 17:59:53.351000 audit: BPF prog-id=67 op=UNLOAD Jan 16 17:59:53.352000 audit: BPF prog-id=123 
op=LOAD Jan 16 17:59:53.352000 audit: BPF prog-id=124 op=LOAD Jan 16 17:59:53.352000 audit: BPF prog-id=68 op=UNLOAD Jan 16 17:59:53.352000 audit: BPF prog-id=69 op=UNLOAD Jan 16 17:59:53.353000 audit: BPF prog-id=125 op=LOAD Jan 16 17:59:53.353000 audit: BPF prog-id=64 op=UNLOAD Jan 16 17:59:53.353000 audit: BPF prog-id=126 op=LOAD Jan 16 17:59:53.353000 audit: BPF prog-id=63 op=UNLOAD Jan 16 17:59:53.354000 audit: BPF prog-id=127 op=LOAD Jan 16 17:59:53.354000 audit: BPF prog-id=70 op=UNLOAD Jan 16 17:59:53.355000 audit: BPF prog-id=128 op=LOAD Jan 16 17:59:53.355000 audit: BPF prog-id=129 op=LOAD Jan 16 17:59:53.355000 audit: BPF prog-id=71 op=UNLOAD Jan 16 17:59:53.355000 audit: BPF prog-id=72 op=UNLOAD Jan 16 17:59:53.355000 audit: BPF prog-id=130 op=LOAD Jan 16 17:59:53.355000 audit: BPF prog-id=73 op=UNLOAD Jan 16 17:59:53.355000 audit: BPF prog-id=131 op=LOAD Jan 16 17:59:53.355000 audit: BPF prog-id=132 op=LOAD Jan 16 17:59:53.355000 audit: BPF prog-id=74 op=UNLOAD Jan 16 17:59:53.355000 audit: BPF prog-id=75 op=UNLOAD Jan 16 17:59:53.492255 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:59:53.492000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:53.509048 (kubelet)[2921]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 16 17:59:53.546513 kubelet[2921]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 16 17:59:53.546513 kubelet[2921]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Jan 16 17:59:53.546513 kubelet[2921]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 16 17:59:53.546513 kubelet[2921]: I0116 17:59:53.546444 2921 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 16 17:59:53.553946 kubelet[2921]: I0116 17:59:53.553911 2921 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jan 16 17:59:53.553946 kubelet[2921]: I0116 17:59:53.553941 2921 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 16 17:59:53.554204 kubelet[2921]: I0116 17:59:53.554188 2921 server.go:954] "Client rotation is on, will bootstrap in background" Jan 16 17:59:53.555445 kubelet[2921]: I0116 17:59:53.555409 2921 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 16 17:59:53.557771 kubelet[2921]: I0116 17:59:53.557637 2921 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 16 17:59:53.561455 kubelet[2921]: I0116 17:59:53.561433 2921 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 16 17:59:53.563993 kubelet[2921]: I0116 17:59:53.563966 2921 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 16 17:59:53.564194 kubelet[2921]: I0116 17:59:53.564151 2921 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 16 17:59:53.564345 kubelet[2921]: I0116 17:59:53.564177 2921 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4580-0-0-p-7f6b5ebc40","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 16 17:59:53.564345 kubelet[2921]: I0116 17:59:53.564343 2921 topology_manager.go:138] "Creating topology manager 
with none policy" Jan 16 17:59:53.564467 kubelet[2921]: I0116 17:59:53.564353 2921 container_manager_linux.go:304] "Creating device plugin manager" Jan 16 17:59:53.564467 kubelet[2921]: I0116 17:59:53.564396 2921 state_mem.go:36] "Initialized new in-memory state store" Jan 16 17:59:53.564579 kubelet[2921]: I0116 17:59:53.564552 2921 kubelet.go:446] "Attempting to sync node with API server" Jan 16 17:59:53.564579 kubelet[2921]: I0116 17:59:53.564564 2921 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 16 17:59:53.565219 kubelet[2921]: I0116 17:59:53.564583 2921 kubelet.go:352] "Adding apiserver pod source" Jan 16 17:59:53.565219 kubelet[2921]: I0116 17:59:53.564593 2921 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 16 17:59:53.565307 kubelet[2921]: I0116 17:59:53.565224 2921 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 16 17:59:53.566141 kubelet[2921]: I0116 17:59:53.566118 2921 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 16 17:59:53.566596 kubelet[2921]: I0116 17:59:53.566575 2921 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 16 17:59:53.566636 kubelet[2921]: I0116 17:59:53.566611 2921 server.go:1287] "Started kubelet" Jan 16 17:59:53.567577 kubelet[2921]: I0116 17:59:53.567525 2921 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jan 16 17:59:53.567630 kubelet[2921]: I0116 17:59:53.567576 2921 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 16 17:59:53.567888 kubelet[2921]: I0116 17:59:53.567861 2921 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 16 17:59:53.572956 kubelet[2921]: E0116 17:59:53.570957 2921 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 16 17:59:53.572956 kubelet[2921]: I0116 17:59:53.571778 2921 server.go:479] "Adding debug handlers to kubelet server" Jan 16 17:59:53.572956 kubelet[2921]: I0116 17:59:53.572228 2921 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 16 17:59:53.573887 kubelet[2921]: I0116 17:59:53.573493 2921 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 16 17:59:53.574769 kubelet[2921]: I0116 17:59:53.574739 2921 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 16 17:59:53.574933 kubelet[2921]: E0116 17:59:53.574910 2921 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4580-0-0-p-7f6b5ebc40\" not found" Jan 16 17:59:53.575123 kubelet[2921]: I0116 17:59:53.575108 2921 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 16 17:59:53.575645 kubelet[2921]: I0116 17:59:53.575625 2921 reconciler.go:26] "Reconciler: start to sync state" Jan 16 17:59:53.578027 kubelet[2921]: I0116 17:59:53.577567 2921 factory.go:221] Registration of the systemd container factory successfully Jan 16 17:59:53.578027 kubelet[2921]: I0116 17:59:53.577677 2921 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 16 17:59:53.586830 kubelet[2921]: I0116 17:59:53.586756 2921 factory.go:221] Registration of the containerd container factory successfully Jan 16 17:59:53.598881 kubelet[2921]: I0116 17:59:53.598842 2921 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 16 17:59:53.600043 kubelet[2921]: I0116 17:59:53.600021 2921 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 16 17:59:53.600147 kubelet[2921]: I0116 17:59:53.600136 2921 status_manager.go:227] "Starting to sync pod status with apiserver" Jan 16 17:59:53.600219 kubelet[2921]: I0116 17:59:53.600207 2921 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 16 17:59:53.600268 kubelet[2921]: I0116 17:59:53.600260 2921 kubelet.go:2382] "Starting kubelet main sync loop" Jan 16 17:59:53.600362 kubelet[2921]: E0116 17:59:53.600341 2921 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 16 17:59:53.626103 kubelet[2921]: I0116 17:59:53.626074 2921 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 16 17:59:53.626103 kubelet[2921]: I0116 17:59:53.626095 2921 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 16 17:59:53.626251 kubelet[2921]: I0116 17:59:53.626118 2921 state_mem.go:36] "Initialized new in-memory state store" Jan 16 17:59:53.626304 kubelet[2921]: I0116 17:59:53.626286 2921 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 16 17:59:53.626330 kubelet[2921]: I0116 17:59:53.626302 2921 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 16 17:59:53.626330 kubelet[2921]: I0116 17:59:53.626321 2921 policy_none.go:49] "None policy: Start" Jan 16 17:59:53.626330 kubelet[2921]: I0116 17:59:53.626330 2921 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 16 17:59:53.626393 kubelet[2921]: I0116 17:59:53.626340 2921 state_mem.go:35] "Initializing new in-memory state store" Jan 16 17:59:53.626470 kubelet[2921]: I0116 17:59:53.626459 2921 state_mem.go:75] "Updated machine memory state" Jan 16 17:59:53.630232 kubelet[2921]: I0116 17:59:53.629836 2921 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 16 17:59:53.630232 kubelet[2921]: I0116 
17:59:53.629989 2921 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 16 17:59:53.630232 kubelet[2921]: I0116 17:59:53.630000 2921 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 16 17:59:53.630385 kubelet[2921]: I0116 17:59:53.630371 2921 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 16 17:59:53.631441 kubelet[2921]: E0116 17:59:53.631408 2921 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 16 17:59:53.701335 kubelet[2921]: I0116 17:59:53.701283 2921 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.701485 kubelet[2921]: I0116 17:59:53.701376 2921 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.701626 kubelet[2921]: I0116 17:59:53.701612 2921 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.732583 kubelet[2921]: I0116 17:59:53.732556 2921 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.739337 kubelet[2921]: I0116 17:59:53.739284 2921 kubelet_node_status.go:124] "Node was previously registered" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.739441 kubelet[2921]: I0116 17:59:53.739370 2921 kubelet_node_status.go:78] "Successfully registered node" node="ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.777086 kubelet[2921]: I0116 17:59:53.777038 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9d4bef785720e8fe9fa0ba95ba727392-ca-certs\") pod \"kube-apiserver-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"9d4bef785720e8fe9fa0ba95ba727392\") " 
pod="kube-system/kube-apiserver-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.777086 kubelet[2921]: I0116 17:59:53.777089 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9d4bef785720e8fe9fa0ba95ba727392-k8s-certs\") pod \"kube-apiserver-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"9d4bef785720e8fe9fa0ba95ba727392\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.777244 kubelet[2921]: I0116 17:59:53.777111 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c1e3b217a329036b0919e1d721717bf4-flexvolume-dir\") pod \"kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"c1e3b217a329036b0919e1d721717bf4\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.777244 kubelet[2921]: I0116 17:59:53.777129 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c1e3b217a329036b0919e1d721717bf4-k8s-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"c1e3b217a329036b0919e1d721717bf4\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.777244 kubelet[2921]: I0116 17:59:53.777145 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c1e3b217a329036b0919e1d721717bf4-kubeconfig\") pod \"kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"c1e3b217a329036b0919e1d721717bf4\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.777244 kubelet[2921]: I0116 17:59:53.777160 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/e05f0c3cd193c08a433dbb0fa21854cf-kubeconfig\") pod \"kube-scheduler-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"e05f0c3cd193c08a433dbb0fa21854cf\") " pod="kube-system/kube-scheduler-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.777244 kubelet[2921]: I0116 17:59:53.777176 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9d4bef785720e8fe9fa0ba95ba727392-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"9d4bef785720e8fe9fa0ba95ba727392\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.777349 kubelet[2921]: I0116 17:59:53.777192 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c1e3b217a329036b0919e1d721717bf4-ca-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"c1e3b217a329036b0919e1d721717bf4\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:53.777349 kubelet[2921]: I0116 17:59:53.777223 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c1e3b217a329036b0919e1d721717bf4-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40\" (UID: \"c1e3b217a329036b0919e1d721717bf4\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40" Jan 16 17:59:54.565869 kubelet[2921]: I0116 17:59:54.565773 2921 apiserver.go:52] "Watching apiserver" Jan 16 17:59:54.575901 kubelet[2921]: I0116 17:59:54.575856 2921 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 16 17:59:54.632971 kubelet[2921]: I0116 17:59:54.632894 2921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-scheduler-ci-4580-0-0-p-7f6b5ebc40" podStartSLOduration=1.632875904 podStartE2EDuration="1.632875904s" podCreationTimestamp="2026-01-16 17:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 17:59:54.621806311 +0000 UTC m=+1.109593598" watchObservedRunningTime="2026-01-16 17:59:54.632875904 +0000 UTC m=+1.120663231" Jan 16 17:59:54.642334 kubelet[2921]: I0116 17:59:54.642287 2921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4580-0-0-p-7f6b5ebc40" podStartSLOduration=1.6422729729999999 podStartE2EDuration="1.642272973s" podCreationTimestamp="2026-01-16 17:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 17:59:54.641158209 +0000 UTC m=+1.128945496" watchObservedRunningTime="2026-01-16 17:59:54.642272973 +0000 UTC m=+1.130060260" Jan 16 17:59:54.642556 kubelet[2921]: I0116 17:59:54.642372 2921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-7f6b5ebc40" podStartSLOduration=1.642367973 podStartE2EDuration="1.642367973s" podCreationTimestamp="2026-01-16 17:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 17:59:54.633157465 +0000 UTC m=+1.120944752" watchObservedRunningTime="2026-01-16 17:59:54.642367973 +0000 UTC m=+1.130155260" Jan 16 17:59:59.714407 kubelet[2921]: I0116 17:59:59.714343 2921 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 16 17:59:59.714755 containerd[1658]: time="2026-01-16T17:59:59.714676342Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Jan 16 17:59:59.714915 kubelet[2921]: I0116 17:59:59.714817 2921 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 16 18:00:00.681523 systemd[1]: Created slice kubepods-besteffort-pod6c63da54_e81c_4268_8692_a0000575d89d.slice - libcontainer container kubepods-besteffort-pod6c63da54_e81c_4268_8692_a0000575d89d.slice. Jan 16 18:00:00.719182 kubelet[2921]: I0116 18:00:00.719015 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/6c63da54-e81c-4268-8692-a0000575d89d-kube-proxy\") pod \"kube-proxy-nmszg\" (UID: \"6c63da54-e81c-4268-8692-a0000575d89d\") " pod="kube-system/kube-proxy-nmszg" Jan 16 18:00:00.719182 kubelet[2921]: I0116 18:00:00.719068 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6c63da54-e81c-4268-8692-a0000575d89d-xtables-lock\") pod \"kube-proxy-nmszg\" (UID: \"6c63da54-e81c-4268-8692-a0000575d89d\") " pod="kube-system/kube-proxy-nmszg" Jan 16 18:00:00.719182 kubelet[2921]: I0116 18:00:00.719087 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6c63da54-e81c-4268-8692-a0000575d89d-lib-modules\") pod \"kube-proxy-nmszg\" (UID: \"6c63da54-e81c-4268-8692-a0000575d89d\") " pod="kube-system/kube-proxy-nmszg" Jan 16 18:00:00.719182 kubelet[2921]: I0116 18:00:00.719106 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdjpq\" (UniqueName: \"kubernetes.io/projected/6c63da54-e81c-4268-8692-a0000575d89d-kube-api-access-xdjpq\") pod \"kube-proxy-nmszg\" (UID: \"6c63da54-e81c-4268-8692-a0000575d89d\") " pod="kube-system/kube-proxy-nmszg" Jan 16 18:00:00.791506 systemd[1]: Created slice 
kubepods-besteffort-podfeb47256_4ec4_4448_befd_f12516c738d1.slice - libcontainer container kubepods-besteffort-podfeb47256_4ec4_4448_befd_f12516c738d1.slice. Jan 16 18:00:00.819494 kubelet[2921]: I0116 18:00:00.819376 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/feb47256-4ec4-4448-befd-f12516c738d1-var-lib-calico\") pod \"tigera-operator-7dcd859c48-hlvph\" (UID: \"feb47256-4ec4-4448-befd-f12516c738d1\") " pod="tigera-operator/tigera-operator-7dcd859c48-hlvph" Jan 16 18:00:00.819494 kubelet[2921]: I0116 18:00:00.819495 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w8kt\" (UniqueName: \"kubernetes.io/projected/feb47256-4ec4-4448-befd-f12516c738d1-kube-api-access-4w8kt\") pod \"tigera-operator-7dcd859c48-hlvph\" (UID: \"feb47256-4ec4-4448-befd-f12516c738d1\") " pod="tigera-operator/tigera-operator-7dcd859c48-hlvph" Jan 16 18:00:00.996376 containerd[1658]: time="2026-01-16T18:00:00.996121784Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-nmszg,Uid:6c63da54-e81c-4268-8692-a0000575d89d,Namespace:kube-system,Attempt:0,}" Jan 16 18:00:01.083926 containerd[1658]: time="2026-01-16T18:00:01.083881570Z" level=info msg="connecting to shim 6f3e97c4e027a8694735780e016f56380235ecd9fb431a59a2c872370c3e7581" address="unix:///run/containerd/s/d48e1e7aabbaa36a5a932edc866b00e6991538325761592f87117315a98acb2c" namespace=k8s.io protocol=ttrpc version=3 Jan 16 18:00:01.095677 containerd[1658]: time="2026-01-16T18:00:01.095611606Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-hlvph,Uid:feb47256-4ec4-4448-befd-f12516c738d1,Namespace:tigera-operator,Attempt:0,}" Jan 16 18:00:01.113819 systemd[1]: Started cri-containerd-6f3e97c4e027a8694735780e016f56380235ecd9fb431a59a2c872370c3e7581.scope - libcontainer container 
6f3e97c4e027a8694735780e016f56380235ecd9fb431a59a2c872370c3e7581. Jan 16 18:00:01.123000 audit: BPF prog-id=133 op=LOAD Jan 16 18:00:01.124469 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 16 18:00:01.124526 kernel: audit: type=1334 audit(1768586401.123:444): prog-id=133 op=LOAD Jan 16 18:00:01.126217 kernel: audit: type=1334 audit(1768586401.123:445): prog-id=134 op=LOAD Jan 16 18:00:01.126263 kernel: audit: type=1300 audit(1768586401.123:445): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2979 pid=2990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.123000 audit: BPF prog-id=134 op=LOAD Jan 16 18:00:01.123000 audit[2990]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2979 pid=2990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.123000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666336539376334653032376138363934373335373830653031366635 Jan 16 18:00:01.133614 kernel: audit: type=1327 audit(1768586401.123:445): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666336539376334653032376138363934373335373830653031366635 Jan 16 18:00:01.124000 audit: BPF prog-id=134 op=UNLOAD Jan 16 18:00:01.134676 kernel: audit: type=1334 audit(1768586401.124:446): prog-id=134 op=UNLOAD Jan 16 18:00:01.134766 kernel: audit: type=1300 audit(1768586401.124:446): 
arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2979 pid=2990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.124000 audit[2990]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2979 pid=2990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.124000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666336539376334653032376138363934373335373830653031366635 Jan 16 18:00:01.141962 kernel: audit: type=1327 audit(1768586401.124:446): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666336539376334653032376138363934373335373830653031366635 Jan 16 18:00:01.142031 kernel: audit: type=1334 audit(1768586401.124:447): prog-id=135 op=LOAD Jan 16 18:00:01.124000 audit: BPF prog-id=135 op=LOAD Jan 16 18:00:01.124000 audit[2990]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2979 pid=2990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.146410 kernel: audit: type=1300 audit(1768586401.124:447): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2979 pid=2990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.124000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666336539376334653032376138363934373335373830653031366635 Jan 16 18:00:01.150146 kernel: audit: type=1327 audit(1768586401.124:447): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666336539376334653032376138363934373335373830653031366635 Jan 16 18:00:01.124000 audit: BPF prog-id=136 op=LOAD Jan 16 18:00:01.124000 audit[2990]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2979 pid=2990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.124000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666336539376334653032376138363934373335373830653031366635 Jan 16 18:00:01.129000 audit: BPF prog-id=136 op=UNLOAD Jan 16 18:00:01.129000 audit[2990]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2979 pid=2990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.129000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666336539376334653032376138363934373335373830653031366635 Jan 16 18:00:01.129000 audit: BPF prog-id=135 op=UNLOAD Jan 16 18:00:01.129000 audit[2990]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2979 pid=2990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666336539376334653032376138363934373335373830653031366635 Jan 16 18:00:01.129000 audit: BPF prog-id=137 op=LOAD Jan 16 18:00:01.129000 audit[2990]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2979 pid=2990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666336539376334653032376138363934373335373830653031366635 Jan 16 18:00:01.169620 containerd[1658]: time="2026-01-16T18:00:01.169580870Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-nmszg,Uid:6c63da54-e81c-4268-8692-a0000575d89d,Namespace:kube-system,Attempt:0,} returns sandbox id \"6f3e97c4e027a8694735780e016f56380235ecd9fb431a59a2c872370c3e7581\"" Jan 16 18:00:01.177752 containerd[1658]: 
time="2026-01-16T18:00:01.177716734Z" level=info msg="CreateContainer within sandbox \"6f3e97c4e027a8694735780e016f56380235ecd9fb431a59a2c872370c3e7581\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 16 18:00:01.188277 containerd[1658]: time="2026-01-16T18:00:01.188236246Z" level=info msg="connecting to shim c251e3cff3b9fc6edd0485af3e7894b38b01802328d56ea2304862e369604a2d" address="unix:///run/containerd/s/3ad9b1bba85fc187dd2de2bc469d137db727bb5570ed6e4303c1ec5259ca0681" namespace=k8s.io protocol=ttrpc version=3 Jan 16 18:00:01.209641 systemd[1]: Started cri-containerd-c251e3cff3b9fc6edd0485af3e7894b38b01802328d56ea2304862e369604a2d.scope - libcontainer container c251e3cff3b9fc6edd0485af3e7894b38b01802328d56ea2304862e369604a2d. Jan 16 18:00:01.217166 containerd[1658]: time="2026-01-16T18:00:01.217101094Z" level=info msg="Container e1e2c1bf997f7b4afd7fb0a2fda7e5c623cb00864c4aeac717176e64b3d15376: CDI devices from CRI Config.CDIDevices: []" Jan 16 18:00:01.221000 audit: BPF prog-id=138 op=LOAD Jan 16 18:00:01.222000 audit: BPF prog-id=139 op=LOAD Jan 16 18:00:01.222000 audit[3033]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3023 pid=3033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.222000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332353165336366663362396663366564643034383561663365373839 Jan 16 18:00:01.222000 audit: BPF prog-id=139 op=UNLOAD Jan 16 18:00:01.222000 audit[3033]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.222000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332353165336366663362396663366564643034383561663365373839 Jan 16 18:00:01.222000 audit: BPF prog-id=140 op=LOAD Jan 16 18:00:01.222000 audit[3033]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3023 pid=3033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.222000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332353165336366663362396663366564643034383561663365373839 Jan 16 18:00:01.222000 audit: BPF prog-id=141 op=LOAD Jan 16 18:00:01.222000 audit[3033]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3023 pid=3033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.222000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332353165336366663362396663366564643034383561663365373839 Jan 16 18:00:01.223000 audit: BPF prog-id=141 op=UNLOAD Jan 16 18:00:01.223000 audit[3033]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3033 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.223000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332353165336366663362396663366564643034383561663365373839 Jan 16 18:00:01.223000 audit: BPF prog-id=140 op=UNLOAD Jan 16 18:00:01.223000 audit[3033]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.223000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332353165336366663362396663366564643034383561663365373839 Jan 16 18:00:01.223000 audit: BPF prog-id=142 op=LOAD Jan 16 18:00:01.223000 audit[3033]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3023 pid=3033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.223000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332353165336366663362396663366564643034383561663365373839 Jan 16 18:00:01.249015 containerd[1658]: time="2026-01-16T18:00:01.248814110Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:tigera-operator-7dcd859c48-hlvph,Uid:feb47256-4ec4-4448-befd-f12516c738d1,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c251e3cff3b9fc6edd0485af3e7894b38b01802328d56ea2304862e369604a2d\"" Jan 16 18:00:01.250938 containerd[1658]: time="2026-01-16T18:00:01.250848676Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 16 18:00:01.252198 containerd[1658]: time="2026-01-16T18:00:01.250996236Z" level=info msg="CreateContainer within sandbox \"6f3e97c4e027a8694735780e016f56380235ecd9fb431a59a2c872370c3e7581\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"e1e2c1bf997f7b4afd7fb0a2fda7e5c623cb00864c4aeac717176e64b3d15376\"" Jan 16 18:00:01.252853 containerd[1658]: time="2026-01-16T18:00:01.252824842Z" level=info msg="StartContainer for \"e1e2c1bf997f7b4afd7fb0a2fda7e5c623cb00864c4aeac717176e64b3d15376\"" Jan 16 18:00:01.256235 containerd[1658]: time="2026-01-16T18:00:01.256189652Z" level=info msg="connecting to shim e1e2c1bf997f7b4afd7fb0a2fda7e5c623cb00864c4aeac717176e64b3d15376" address="unix:///run/containerd/s/d48e1e7aabbaa36a5a932edc866b00e6991538325761592f87117315a98acb2c" protocol=ttrpc version=3 Jan 16 18:00:01.276885 systemd[1]: Started cri-containerd-e1e2c1bf997f7b4afd7fb0a2fda7e5c623cb00864c4aeac717176e64b3d15376.scope - libcontainer container e1e2c1bf997f7b4afd7fb0a2fda7e5c623cb00864c4aeac717176e64b3d15376. 
Jan 16 18:00:01.349000 audit: BPF prog-id=143 op=LOAD Jan 16 18:00:01.349000 audit[3059]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2979 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531653263316266393937663762346166643766623061326664613765 Jan 16 18:00:01.349000 audit: BPF prog-id=144 op=LOAD Jan 16 18:00:01.349000 audit[3059]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2979 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531653263316266393937663762346166643766623061326664613765 Jan 16 18:00:01.349000 audit: BPF prog-id=144 op=UNLOAD Jan 16 18:00:01.349000 audit[3059]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2979 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.349000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531653263316266393937663762346166643766623061326664613765 Jan 16 18:00:01.349000 audit: BPF prog-id=143 op=UNLOAD Jan 16 18:00:01.349000 audit[3059]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2979 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531653263316266393937663762346166643766623061326664613765 Jan 16 18:00:01.349000 audit: BPF prog-id=145 op=LOAD Jan 16 18:00:01.349000 audit[3059]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2979 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531653263316266393937663762346166643766623061326664613765 Jan 16 18:00:01.366469 containerd[1658]: time="2026-01-16T18:00:01.366419306Z" level=info msg="StartContainer for \"e1e2c1bf997f7b4afd7fb0a2fda7e5c623cb00864c4aeac717176e64b3d15376\" returns successfully" Jan 16 18:00:01.523000 audit[3123]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3123 subj=system_u:system_r:kernel_t:s0 comm="iptables" 
Jan 16 18:00:01.523000 audit[3123]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd4a04890 a2=0 a3=1 items=0 ppid=3072 pid=3123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.523000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 16 18:00:01.523000 audit[3124]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3124 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.523000 audit[3124]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffffec6c2d0 a2=0 a3=1 items=0 ppid=3072 pid=3124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.523000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 16 18:00:01.524000 audit[3126]: NETFILTER_CFG table=nat:56 family=2 entries=1 op=nft_register_chain pid=3126 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.524000 audit[3126]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc22e2fc0 a2=0 a3=1 items=0 ppid=3072 pid=3126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.524000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 16 18:00:01.525000 audit[3127]: NETFILTER_CFG table=nat:57 family=10 entries=1 op=nft_register_chain pid=3127 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Jan 16 18:00:01.525000 audit[3127]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd00ec0a0 a2=0 a3=1 items=0 ppid=3072 pid=3127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.525000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 16 18:00:01.526000 audit[3128]: NETFILTER_CFG table=filter:58 family=10 entries=1 op=nft_register_chain pid=3128 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.526000 audit[3128]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff5e6e190 a2=0 a3=1 items=0 ppid=3072 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.526000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 16 18:00:01.529000 audit[3129]: NETFILTER_CFG table=filter:59 family=2 entries=1 op=nft_register_chain pid=3129 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.529000 audit[3129]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffdfb97210 a2=0 a3=1 items=0 ppid=3072 pid=3129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.529000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 16 18:00:01.628000 audit[3130]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3130 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.628000 audit[3130]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffcfec5670 a2=0 a3=1 items=0 ppid=3072 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.628000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 16 18:00:01.633000 audit[3132]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3132 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.633000 audit[3132]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=fffffc9d9c80 a2=0 a3=1 items=0 ppid=3072 pid=3132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.633000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 16 18:00:01.638000 audit[3135]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3135 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.638000 audit[3135]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffd04b6100 a2=0 a3=1 items=0 ppid=3072 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.638000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 16 18:00:01.639000 audit[3136]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3136 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.639000 audit[3136]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe79f5530 a2=0 a3=1 items=0 ppid=3072 pid=3136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.639000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 16 18:00:01.641047 kubelet[2921]: I0116 18:00:01.640806 2921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-nmszg" podStartSLOduration=1.640789297 podStartE2EDuration="1.640789297s" podCreationTimestamp="2026-01-16 18:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 18:00:01.640746497 +0000 UTC m=+8.128533784" watchObservedRunningTime="2026-01-16 18:00:01.640789297 +0000 UTC m=+8.128576584" Jan 16 18:00:01.643000 audit[3138]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3138 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.643000 audit[3138]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffe8567320 a2=0 a3=1 items=0 ppid=3072 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
18:00:01.643000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 16 18:00:01.644000 audit[3139]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3139 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.644000 audit[3139]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe538bd70 a2=0 a3=1 items=0 ppid=3072 pid=3139 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.644000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 16 18:00:01.647000 audit[3141]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3141 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.647000 audit[3141]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffffe5c7e30 a2=0 a3=1 items=0 ppid=3072 pid=3141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.647000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 16 18:00:01.651000 audit[3144]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3144 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.651000 audit[3144]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=744 a0=3 a1=ffffffed9600 a2=0 a3=1 items=0 ppid=3072 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.651000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 16 18:00:01.652000 audit[3145]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3145 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.652000 audit[3145]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe15f0b70 a2=0 a3=1 items=0 ppid=3072 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.652000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 16 18:00:01.654000 audit[3147]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3147 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.654000 audit[3147]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffdb65b9e0 a2=0 a3=1 items=0 ppid=3072 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.654000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 16 18:00:01.655000 audit[3148]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3148 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.655000 audit[3148]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff448e960 a2=0 a3=1 items=0 ppid=3072 pid=3148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.655000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 16 18:00:01.658000 audit[3150]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3150 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.658000 audit[3150]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffe7b96610 a2=0 a3=1 items=0 ppid=3072 pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.658000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 16 18:00:01.661000 audit[3153]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3153 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.661000 audit[3153]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffde945e50 a2=0 
a3=1 items=0 ppid=3072 pid=3153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.661000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 16 18:00:01.665000 audit[3156]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3156 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.665000 audit[3156]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffffc0ffe50 a2=0 a3=1 items=0 ppid=3072 pid=3156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.665000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 16 18:00:01.666000 audit[3157]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3157 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.666000 audit[3157]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd9457f20 a2=0 a3=1 items=0 ppid=3072 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.666000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 
Jan 16 18:00:01.668000 audit[3159]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3159 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.668000 audit[3159]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffd54d3d90 a2=0 a3=1 items=0 ppid=3072 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.668000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 18:00:01.672000 audit[3162]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3162 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.672000 audit[3162]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffc3a75170 a2=0 a3=1 items=0 ppid=3072 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.672000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 18:00:01.673000 audit[3163]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3163 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.673000 audit[3163]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcd461350 a2=0 a3=1 items=0 ppid=3072 pid=3163 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.673000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 16 18:00:01.675000 audit[3165]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3165 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 18:00:01.675000 audit[3165]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=fffffe5c7200 a2=0 a3=1 items=0 ppid=3072 pid=3165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.675000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 16 18:00:01.698000 audit[3171]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3171 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:01.698000 audit[3171]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff273eeb0 a2=0 a3=1 items=0 ppid=3072 pid=3171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.698000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:01.710000 audit[3171]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3171 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:01.710000 audit[3171]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=fffff273eeb0 a2=0 a3=1 items=0 
ppid=3072 pid=3171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.710000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:01.711000 audit[3176]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3176 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.711000 audit[3176]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffe768b240 a2=0 a3=1 items=0 ppid=3072 pid=3176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.711000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 16 18:00:01.714000 audit[3178]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3178 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.714000 audit[3178]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=ffffc72a00a0 a2=0 a3=1 items=0 ppid=3072 pid=3178 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.714000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 16 18:00:01.718000 audit[3181]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3181 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.718000 audit[3181]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffe3391080 a2=0 a3=1 items=0 ppid=3072 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.718000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 16 18:00:01.719000 audit[3182]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3182 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.719000 audit[3182]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffac17380 a2=0 a3=1 items=0 ppid=3072 pid=3182 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.719000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 16 18:00:01.721000 audit[3184]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3184 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.721000 audit[3184]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffc5651040 a2=0 a3=1 items=0 ppid=3072 pid=3184 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.721000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 16 18:00:01.723000 audit[3185]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3185 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.723000 audit[3185]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe04614d0 a2=0 a3=1 items=0 ppid=3072 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.723000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 16 18:00:01.725000 audit[3187]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3187 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.725000 audit[3187]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffc9eb9070 a2=0 a3=1 items=0 ppid=3072 pid=3187 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.725000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 16 18:00:01.729000 audit[3190]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3190 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.729000 audit[3190]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 
a0=3 a1=fffff6713fe0 a2=0 a3=1 items=0 ppid=3072 pid=3190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.729000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 16 18:00:01.730000 audit[3191]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3191 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.730000 audit[3191]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc1e252d0 a2=0 a3=1 items=0 ppid=3072 pid=3191 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.730000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 16 18:00:01.734000 audit[3193]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3193 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.734000 audit[3193]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffe5726250 a2=0 a3=1 items=0 ppid=3072 pid=3193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.734000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 16 18:00:01.735000 audit[3194]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3194 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.735000 audit[3194]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe5c4ef00 a2=0 a3=1 items=0 ppid=3072 pid=3194 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.735000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 16 18:00:01.737000 audit[3196]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3196 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.737000 audit[3196]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffcf832f80 a2=0 a3=1 items=0 ppid=3072 pid=3196 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.737000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 16 18:00:01.741000 audit[3199]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3199 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.741000 audit[3199]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 
a1=ffffd7037ee0 a2=0 a3=1 items=0 ppid=3072 pid=3199 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.741000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 16 18:00:01.744000 audit[3202]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3202 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.744000 audit[3202]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffcd723c70 a2=0 a3=1 items=0 ppid=3072 pid=3202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.744000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 16 18:00:01.745000 audit[3203]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3203 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.745000 audit[3203]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffe679f330 a2=0 a3=1 items=0 ppid=3072 pid=3203 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.745000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 16 18:00:01.747000 audit[3205]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3205 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.747000 audit[3205]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=fffff16180a0 a2=0 a3=1 items=0 ppid=3072 pid=3205 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.747000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 18:00:01.751000 audit[3208]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3208 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.751000 audit[3208]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffdb69fa60 a2=0 a3=1 items=0 ppid=3072 pid=3208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.751000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 18:00:01.752000 audit[3209]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3209 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.752000 audit[3209]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffea3eab0 a2=0 a3=1 items=0 ppid=3072 pid=3209 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.752000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 16 18:00:01.754000 audit[3211]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3211 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.754000 audit[3211]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffc76b6780 a2=0 a3=1 items=0 ppid=3072 pid=3211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.754000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 16 18:00:01.755000 audit[3212]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3212 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.755000 audit[3212]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcc7a93d0 a2=0 a3=1 items=0 ppid=3072 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.755000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 16 18:00:01.758000 audit[3214]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3214 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.758000 audit[3214]: SYSCALL 
arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffd7be16c0 a2=0 a3=1 items=0 ppid=3072 pid=3214 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.758000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 16 18:00:01.761000 audit[3217]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3217 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 18:00:01.761000 audit[3217]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffd18cd1b0 a2=0 a3=1 items=0 ppid=3072 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.761000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 16 18:00:01.764000 audit[3219]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3219 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 16 18:00:01.764000 audit[3219]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 a1=ffffc6c4ac20 a2=0 a3=1 items=0 ppid=3072 pid=3219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.764000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:01.765000 audit[3219]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3219 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables-resto" Jan 16 18:00:01.765000 audit[3219]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffc6c4ac20 a2=0 a3=1 items=0 ppid=3072 pid=3219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:01.765000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:03.907587 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount852595217.mount: Deactivated successfully. Jan 16 18:00:04.183342 containerd[1658]: time="2026-01-16T18:00:04.182778239Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 18:00:04.184848 containerd[1658]: time="2026-01-16T18:00:04.184791125Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=20773434" Jan 16 18:00:04.186316 containerd[1658]: time="2026-01-16T18:00:04.186282130Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 18:00:04.188658 containerd[1658]: time="2026-01-16T18:00:04.188593497Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 18:00:04.189327 containerd[1658]: time="2026-01-16T18:00:04.189296659Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size 
\"22147999\" in 2.938407423s" Jan 16 18:00:04.189381 containerd[1658]: time="2026-01-16T18:00:04.189327539Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Jan 16 18:00:04.192292 containerd[1658]: time="2026-01-16T18:00:04.192176348Z" level=info msg="CreateContainer within sandbox \"c251e3cff3b9fc6edd0485af3e7894b38b01802328d56ea2304862e369604a2d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 16 18:00:04.200359 containerd[1658]: time="2026-01-16T18:00:04.199803971Z" level=info msg="Container f7613897bdfe27a3309792e3ca6c3cdfdac7c69f80eca7431c266102fe31f56f: CDI devices from CRI Config.CDIDevices: []" Jan 16 18:00:04.203163 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3894580469.mount: Deactivated successfully. Jan 16 18:00:04.207030 containerd[1658]: time="2026-01-16T18:00:04.206982472Z" level=info msg="CreateContainer within sandbox \"c251e3cff3b9fc6edd0485af3e7894b38b01802328d56ea2304862e369604a2d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"f7613897bdfe27a3309792e3ca6c3cdfdac7c69f80eca7431c266102fe31f56f\"" Jan 16 18:00:04.207462 containerd[1658]: time="2026-01-16T18:00:04.207401354Z" level=info msg="StartContainer for \"f7613897bdfe27a3309792e3ca6c3cdfdac7c69f80eca7431c266102fe31f56f\"" Jan 16 18:00:04.210817 containerd[1658]: time="2026-01-16T18:00:04.210757844Z" level=info msg="connecting to shim f7613897bdfe27a3309792e3ca6c3cdfdac7c69f80eca7431c266102fe31f56f" address="unix:///run/containerd/s/3ad9b1bba85fc187dd2de2bc469d137db727bb5570ed6e4303c1ec5259ca0681" protocol=ttrpc version=3 Jan 16 18:00:04.247848 systemd[1]: Started cri-containerd-f7613897bdfe27a3309792e3ca6c3cdfdac7c69f80eca7431c266102fe31f56f.scope - libcontainer container f7613897bdfe27a3309792e3ca6c3cdfdac7c69f80eca7431c266102fe31f56f. 
Jan 16 18:00:04.257000 audit: BPF prog-id=146 op=LOAD Jan 16 18:00:04.258000 audit: BPF prog-id=147 op=LOAD Jan 16 18:00:04.258000 audit[3229]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=3023 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:04.258000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637363133383937626466653237613333303937393265336361366333 Jan 16 18:00:04.258000 audit: BPF prog-id=147 op=UNLOAD Jan 16 18:00:04.258000 audit[3229]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:04.258000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637363133383937626466653237613333303937393265336361366333 Jan 16 18:00:04.258000 audit: BPF prog-id=148 op=LOAD Jan 16 18:00:04.258000 audit[3229]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=3023 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:04.258000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637363133383937626466653237613333303937393265336361366333 Jan 16 18:00:04.258000 audit: BPF prog-id=149 op=LOAD Jan 16 18:00:04.258000 audit[3229]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=3023 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:04.258000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637363133383937626466653237613333303937393265336361366333 Jan 16 18:00:04.258000 audit: BPF prog-id=149 op=UNLOAD Jan 16 18:00:04.258000 audit[3229]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:04.258000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637363133383937626466653237613333303937393265336361366333 Jan 16 18:00:04.258000 audit: BPF prog-id=148 op=UNLOAD Jan 16 18:00:04.258000 audit[3229]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
16 18:00:04.258000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637363133383937626466653237613333303937393265336361366333 Jan 16 18:00:04.258000 audit: BPF prog-id=150 op=LOAD Jan 16 18:00:04.258000 audit[3229]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=3023 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:04.258000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6637363133383937626466653237613333303937393265336361366333 Jan 16 18:00:04.274725 containerd[1658]: time="2026-01-16T18:00:04.274690078Z" level=info msg="StartContainer for \"f7613897bdfe27a3309792e3ca6c3cdfdac7c69f80eca7431c266102fe31f56f\" returns successfully" Jan 16 18:00:04.648874 kubelet[2921]: I0116 18:00:04.648819 2921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-hlvph" podStartSLOduration=1.709121345 podStartE2EDuration="4.648788451s" podCreationTimestamp="2026-01-16 18:00:00 +0000 UTC" firstStartedPulling="2026-01-16 18:00:01.250449635 +0000 UTC m=+7.738236922" lastFinishedPulling="2026-01-16 18:00:04.190116741 +0000 UTC m=+10.677904028" observedRunningTime="2026-01-16 18:00:04.64852473 +0000 UTC m=+11.136312017" watchObservedRunningTime="2026-01-16 18:00:04.648788451 +0000 UTC m=+11.136575698" Jan 16 18:00:09.541265 sudo[1960]: pam_unix(sudo:session): session closed for user root Jan 16 18:00:09.542766 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 16 
18:00:09.542825 kernel: audit: type=1106 audit(1768586409.540:524): pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 18:00:09.540000 audit[1960]: USER_END pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 18:00:09.540000 audit[1960]: CRED_DISP pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 18:00:09.547652 kernel: audit: type=1104 audit(1768586409.540:525): pid=1960 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 18:00:09.637446 sshd[1959]: Connection closed by 4.153.228.146 port 50154 Jan 16 18:00:09.637793 sshd-session[1955]: pam_unix(sshd:session): session closed for user core Jan 16 18:00:09.638000 audit[1955]: USER_END pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:00:09.638000 audit[1955]: CRED_DISP pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:00:09.644745 systemd[1]: sshd@8-10.0.7.62:22-4.153.228.146:50154.service: Deactivated successfully. 
Jan 16 18:00:09.646813 systemd[1]: session-10.scope: Deactivated successfully. Jan 16 18:00:09.647096 systemd[1]: session-10.scope: Consumed 7.932s CPU time, 227.8M memory peak. Jan 16 18:00:09.647793 kernel: audit: type=1106 audit(1768586409.638:526): pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:00:09.647951 kernel: audit: type=1104 audit(1768586409.638:527): pid=1955 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:00:09.647977 kernel: audit: type=1131 audit(1768586409.644:528): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.7.62:22-4.153.228.146:50154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:00:09.644000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.7.62:22-4.153.228.146:50154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:00:09.649411 systemd-logind[1644]: Session 10 logged out. Waiting for processes to exit. Jan 16 18:00:09.651664 systemd-logind[1644]: Removed session 10. 
Jan 16 18:00:11.240000 audit[3317]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3317 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:11.240000 audit[3317]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffcd2c3aa0 a2=0 a3=1 items=0 ppid=3072 pid=3317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:11.247109 kernel: audit: type=1325 audit(1768586411.240:529): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3317 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:11.247175 kernel: audit: type=1300 audit(1768586411.240:529): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffcd2c3aa0 a2=0 a3=1 items=0 ppid=3072 pid=3317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:11.240000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:11.249186 kernel: audit: type=1327 audit(1768586411.240:529): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:11.249000 audit[3317]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3317 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:11.260447 kernel: audit: type=1325 audit(1768586411.249:530): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3317 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:11.249000 audit[3317]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffcd2c3aa0 a2=0 a3=1 items=0 ppid=3072 pid=3317 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:11.249000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:11.268449 kernel: audit: type=1300 audit(1768586411.249:530): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffcd2c3aa0 a2=0 a3=1 items=0 ppid=3072 pid=3317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:11.274000 audit[3319]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3319 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:11.274000 audit[3319]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=fffff5a07e90 a2=0 a3=1 items=0 ppid=3072 pid=3319 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:11.274000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:11.279000 audit[3319]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3319 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:11.279000 audit[3319]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff5a07e90 a2=0 a3=1 items=0 ppid=3072 pid=3319 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:11.279000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:15.717000 audit[3321]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3321 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:15.718603 kernel: kauditd_printk_skb: 7 callbacks suppressed Jan 16 18:00:15.718635 kernel: audit: type=1325 audit(1768586415.717:533): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3321 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:15.717000 audit[3321]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffeac1add0 a2=0 a3=1 items=0 ppid=3072 pid=3321 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:15.732448 kernel: audit: type=1300 audit(1768586415.717:533): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffeac1add0 a2=0 a3=1 items=0 ppid=3072 pid=3321 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:15.717000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:15.737449 kernel: audit: type=1327 audit(1768586415.717:533): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:15.738000 audit[3321]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3321 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:15.738000 audit[3321]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffeac1add0 a2=0 a3=1 items=0 ppid=3072 pid=3321 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:15.745173 kernel: audit: type=1325 audit(1768586415.738:534): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3321 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:15.745242 kernel: audit: type=1300 audit(1768586415.738:534): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffeac1add0 a2=0 a3=1 items=0 ppid=3072 pid=3321 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:15.738000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:15.747076 kernel: audit: type=1327 audit(1768586415.738:534): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:15.755000 audit[3323]: NETFILTER_CFG table=filter:111 family=2 entries=18 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:15.755000 audit[3323]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff508a3d0 a2=0 a3=1 items=0 ppid=3072 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:15.762123 kernel: audit: type=1325 audit(1768586415.755:535): table=filter:111 family=2 entries=18 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:15.762179 kernel: audit: type=1300 audit(1768586415.755:535): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff508a3d0 a2=0 a3=1 items=0 ppid=3072 pid=3323 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:15.762198 kernel: audit: type=1327 audit(1768586415.755:535): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:15.755000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:15.769000 audit[3323]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:15.769000 audit[3323]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff508a3d0 a2=0 a3=1 items=0 ppid=3072 pid=3323 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:15.769000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:15.772455 kernel: audit: type=1325 audit(1768586415.769:536): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3323 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:16.791000 audit[3325]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3325 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:16.791000 audit[3325]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffff79b5ff0 a2=0 a3=1 items=0 ppid=3072 pid=3325 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:16.791000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:16.798000 audit[3325]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3325 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:16.798000 audit[3325]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff79b5ff0 a2=0 a3=1 items=0 ppid=3072 pid=3325 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:16.798000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:18.612000 audit[3327]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:18.612000 audit[3327]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=fffffda22080 a2=0 a3=1 items=0 ppid=3072 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:18.612000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:18.617000 audit[3327]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:18.617000 audit[3327]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffffda22080 a2=0 a3=1 items=0 ppid=3072 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 16 18:00:18.617000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:18.647850 systemd[1]: Created slice kubepods-besteffort-pod8a703edd_134e_4ee0_8ae9_360513b87ac2.slice - libcontainer container kubepods-besteffort-pod8a703edd_134e_4ee0_8ae9_360513b87ac2.slice. Jan 16 18:00:18.823727 kubelet[2921]: I0116 18:00:18.823632 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a703edd-134e-4ee0-8ae9-360513b87ac2-tigera-ca-bundle\") pod \"calico-typha-6c85564bc7-jmxgc\" (UID: \"8a703edd-134e-4ee0-8ae9-360513b87ac2\") " pod="calico-system/calico-typha-6c85564bc7-jmxgc" Jan 16 18:00:18.823727 kubelet[2921]: I0116 18:00:18.823679 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/8a703edd-134e-4ee0-8ae9-360513b87ac2-typha-certs\") pod \"calico-typha-6c85564bc7-jmxgc\" (UID: \"8a703edd-134e-4ee0-8ae9-360513b87ac2\") " pod="calico-system/calico-typha-6c85564bc7-jmxgc" Jan 16 18:00:18.823727 kubelet[2921]: I0116 18:00:18.823701 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcwlt\" (UniqueName: \"kubernetes.io/projected/8a703edd-134e-4ee0-8ae9-360513b87ac2-kube-api-access-jcwlt\") pod \"calico-typha-6c85564bc7-jmxgc\" (UID: \"8a703edd-134e-4ee0-8ae9-360513b87ac2\") " pod="calico-system/calico-typha-6c85564bc7-jmxgc" Jan 16 18:00:18.828195 systemd[1]: Created slice kubepods-besteffort-pod09d81b66_2b5e_4347_bd6a_b47c71f2fae2.slice - libcontainer container kubepods-besteffort-pod09d81b66_2b5e_4347_bd6a_b47c71f2fae2.slice. 
Jan 16 18:00:18.924407 kubelet[2921]: I0116 18:00:18.924316 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/09d81b66-2b5e-4347-bd6a-b47c71f2fae2-lib-modules\") pod \"calico-node-sx7js\" (UID: \"09d81b66-2b5e-4347-bd6a-b47c71f2fae2\") " pod="calico-system/calico-node-sx7js" Jan 16 18:00:18.924407 kubelet[2921]: I0116 18:00:18.924371 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/09d81b66-2b5e-4347-bd6a-b47c71f2fae2-node-certs\") pod \"calico-node-sx7js\" (UID: \"09d81b66-2b5e-4347-bd6a-b47c71f2fae2\") " pod="calico-system/calico-node-sx7js" Jan 16 18:00:18.924407 kubelet[2921]: I0116 18:00:18.924390 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/09d81b66-2b5e-4347-bd6a-b47c71f2fae2-policysync\") pod \"calico-node-sx7js\" (UID: \"09d81b66-2b5e-4347-bd6a-b47c71f2fae2\") " pod="calico-system/calico-node-sx7js" Jan 16 18:00:18.924607 kubelet[2921]: I0116 18:00:18.924406 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/09d81b66-2b5e-4347-bd6a-b47c71f2fae2-var-run-calico\") pod \"calico-node-sx7js\" (UID: \"09d81b66-2b5e-4347-bd6a-b47c71f2fae2\") " pod="calico-system/calico-node-sx7js" Jan 16 18:00:18.924607 kubelet[2921]: I0116 18:00:18.924470 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/09d81b66-2b5e-4347-bd6a-b47c71f2fae2-flexvol-driver-host\") pod \"calico-node-sx7js\" (UID: \"09d81b66-2b5e-4347-bd6a-b47c71f2fae2\") " pod="calico-system/calico-node-sx7js" Jan 16 18:00:18.924607 kubelet[2921]: I0116 18:00:18.924497 2921 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/09d81b66-2b5e-4347-bd6a-b47c71f2fae2-xtables-lock\") pod \"calico-node-sx7js\" (UID: \"09d81b66-2b5e-4347-bd6a-b47c71f2fae2\") " pod="calico-system/calico-node-sx7js" Jan 16 18:00:18.924607 kubelet[2921]: I0116 18:00:18.924526 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjrf7\" (UniqueName: \"kubernetes.io/projected/09d81b66-2b5e-4347-bd6a-b47c71f2fae2-kube-api-access-zjrf7\") pod \"calico-node-sx7js\" (UID: \"09d81b66-2b5e-4347-bd6a-b47c71f2fae2\") " pod="calico-system/calico-node-sx7js" Jan 16 18:00:18.924607 kubelet[2921]: I0116 18:00:18.924540 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d81b66-2b5e-4347-bd6a-b47c71f2fae2-tigera-ca-bundle\") pod \"calico-node-sx7js\" (UID: \"09d81b66-2b5e-4347-bd6a-b47c71f2fae2\") " pod="calico-system/calico-node-sx7js" Jan 16 18:00:18.924928 kubelet[2921]: I0116 18:00:18.924556 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/09d81b66-2b5e-4347-bd6a-b47c71f2fae2-cni-bin-dir\") pod \"calico-node-sx7js\" (UID: \"09d81b66-2b5e-4347-bd6a-b47c71f2fae2\") " pod="calico-system/calico-node-sx7js" Jan 16 18:00:18.924928 kubelet[2921]: I0116 18:00:18.924577 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/09d81b66-2b5e-4347-bd6a-b47c71f2fae2-cni-net-dir\") pod \"calico-node-sx7js\" (UID: \"09d81b66-2b5e-4347-bd6a-b47c71f2fae2\") " pod="calico-system/calico-node-sx7js" Jan 16 18:00:18.924928 kubelet[2921]: I0116 18:00:18.924595 2921 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/09d81b66-2b5e-4347-bd6a-b47c71f2fae2-cni-log-dir\") pod \"calico-node-sx7js\" (UID: \"09d81b66-2b5e-4347-bd6a-b47c71f2fae2\") " pod="calico-system/calico-node-sx7js" Jan 16 18:00:18.924928 kubelet[2921]: I0116 18:00:18.924609 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/09d81b66-2b5e-4347-bd6a-b47c71f2fae2-var-lib-calico\") pod \"calico-node-sx7js\" (UID: \"09d81b66-2b5e-4347-bd6a-b47c71f2fae2\") " pod="calico-system/calico-node-sx7js" Jan 16 18:00:18.953361 containerd[1658]: time="2026-01-16T18:00:18.953319550Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6c85564bc7-jmxgc,Uid:8a703edd-134e-4ee0-8ae9-360513b87ac2,Namespace:calico-system,Attempt:0,}" Jan 16 18:00:18.974649 containerd[1658]: time="2026-01-16T18:00:18.974458534Z" level=info msg="connecting to shim 24c4634583ea7d42ef50d3cbbc2a4c5a2bf5fffd5680a85a65013d927f93b813" address="unix:///run/containerd/s/f0ef50d94a4a39976ecd1afe2ae45f1a30d6e960e18d5bc6cbe871a4bdf2aadc" namespace=k8s.io protocol=ttrpc version=3 Jan 16 18:00:18.997633 systemd[1]: Started cri-containerd-24c4634583ea7d42ef50d3cbbc2a4c5a2bf5fffd5680a85a65013d927f93b813.scope - libcontainer container 24c4634583ea7d42ef50d3cbbc2a4c5a2bf5fffd5680a85a65013d927f93b813. 
Jan 16 18:00:19.007000 audit: BPF prog-id=151 op=LOAD Jan 16 18:00:19.007000 audit: BPF prog-id=152 op=LOAD Jan 16 18:00:19.007000 audit[3349]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138180 a2=98 a3=0 items=0 ppid=3338 pid=3349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.007000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234633436333435383365613764343265663530643363626263326134 Jan 16 18:00:19.007000 audit: BPF prog-id=152 op=UNLOAD Jan 16 18:00:19.007000 audit[3349]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3338 pid=3349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.007000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234633436333435383365613764343265663530643363626263326134 Jan 16 18:00:19.007000 audit: BPF prog-id=153 op=LOAD Jan 16 18:00:19.007000 audit[3349]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001383e8 a2=98 a3=0 items=0 ppid=3338 pid=3349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.007000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234633436333435383365613764343265663530643363626263326134 Jan 16 18:00:19.007000 audit: BPF prog-id=154 op=LOAD Jan 16 18:00:19.007000 audit[3349]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000138168 a2=98 a3=0 items=0 ppid=3338 pid=3349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.007000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234633436333435383365613764343265663530643363626263326134 Jan 16 18:00:19.007000 audit: BPF prog-id=154 op=UNLOAD Jan 16 18:00:19.007000 audit[3349]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3338 pid=3349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.007000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234633436333435383365613764343265663530643363626263326134 Jan 16 18:00:19.007000 audit: BPF prog-id=153 op=UNLOAD Jan 16 18:00:19.007000 audit[3349]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3338 pid=3349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
16 18:00:19.007000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234633436333435383365613764343265663530643363626263326134 Jan 16 18:00:19.007000 audit: BPF prog-id=155 op=LOAD Jan 16 18:00:19.007000 audit[3349]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138648 a2=98 a3=0 items=0 ppid=3338 pid=3349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.007000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3234633436333435383365613764343265663530643363626263326134 Jan 16 18:00:19.026933 kubelet[2921]: E0116 18:00:19.026836 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.026933 kubelet[2921]: W0116 18:00:19.026865 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.026933 kubelet[2921]: E0116 18:00:19.026900 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.027268 kubelet[2921]: E0116 18:00:19.027219 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.027268 kubelet[2921]: W0116 18:00:19.027235 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.027268 kubelet[2921]: E0116 18:00:19.027248 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.031280 containerd[1658]: time="2026-01-16T18:00:19.031239546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6c85564bc7-jmxgc,Uid:8a703edd-134e-4ee0-8ae9-360513b87ac2,Namespace:calico-system,Attempt:0,} returns sandbox id \"24c4634583ea7d42ef50d3cbbc2a4c5a2bf5fffd5680a85a65013d927f93b813\"" Jan 16 18:00:19.035447 kubelet[2921]: E0116 18:00:19.035046 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.035447 kubelet[2921]: W0116 18:00:19.035067 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.035447 kubelet[2921]: E0116 18:00:19.035084 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.037846 containerd[1658]: time="2026-01-16T18:00:19.037815606Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 16 18:00:19.043128 kubelet[2921]: E0116 18:00:19.043090 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.043128 kubelet[2921]: W0116 18:00:19.043114 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.043128 kubelet[2921]: E0116 18:00:19.043132 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.113766 kubelet[2921]: E0116 18:00:19.113655 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:00:19.126195 kubelet[2921]: E0116 18:00:19.126063 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.126195 kubelet[2921]: W0116 18:00:19.126086 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.126195 kubelet[2921]: E0116 18:00:19.126106 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.126514 kubelet[2921]: E0116 18:00:19.126499 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.126713 kubelet[2921]: W0116 18:00:19.126595 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.126713 kubelet[2921]: E0116 18:00:19.126659 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.126961 kubelet[2921]: E0116 18:00:19.126936 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.127092 kubelet[2921]: W0116 18:00:19.127038 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.127092 kubelet[2921]: E0116 18:00:19.127057 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.127483 kubelet[2921]: E0116 18:00:19.127468 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.127625 kubelet[2921]: W0116 18:00:19.127610 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.127810 kubelet[2921]: E0116 18:00:19.127708 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.128498 kubelet[2921]: E0116 18:00:19.128469 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.128974 kubelet[2921]: W0116 18:00:19.128644 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.129163 kubelet[2921]: E0116 18:00:19.129074 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.129308 kubelet[2921]: E0116 18:00:19.129295 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.129463 kubelet[2921]: W0116 18:00:19.129376 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.129463 kubelet[2921]: E0116 18:00:19.129392 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.129670 kubelet[2921]: E0116 18:00:19.129657 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.129841 kubelet[2921]: W0116 18:00:19.129730 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.129841 kubelet[2921]: E0116 18:00:19.129747 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.130019 kubelet[2921]: E0116 18:00:19.130006 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.130019 kubelet[2921]: W0116 18:00:19.130045 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.130019 kubelet[2921]: E0116 18:00:19.130060 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.130380 kubelet[2921]: E0116 18:00:19.130350 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.130532 kubelet[2921]: W0116 18:00:19.130452 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.130532 kubelet[2921]: E0116 18:00:19.130469 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.130744 kubelet[2921]: E0116 18:00:19.130730 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.130899 kubelet[2921]: W0116 18:00:19.130800 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.130899 kubelet[2921]: E0116 18:00:19.130815 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.131075 kubelet[2921]: E0116 18:00:19.131025 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.131075 kubelet[2921]: W0116 18:00:19.131037 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.131075 kubelet[2921]: E0116 18:00:19.131046 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.131479 kubelet[2921]: E0116 18:00:19.131319 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.131479 kubelet[2921]: W0116 18:00:19.131331 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.131479 kubelet[2921]: E0116 18:00:19.131341 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.131641 kubelet[2921]: E0116 18:00:19.131628 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.131713 kubelet[2921]: W0116 18:00:19.131701 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.131854 kubelet[2921]: E0116 18:00:19.131755 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.131962 kubelet[2921]: E0116 18:00:19.131951 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.132012 kubelet[2921]: W0116 18:00:19.132002 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.132134 kubelet[2921]: E0116 18:00:19.132054 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.132225 kubelet[2921]: E0116 18:00:19.132213 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.132289 kubelet[2921]: W0116 18:00:19.132278 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.132413 kubelet[2921]: E0116 18:00:19.132331 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.132530 kubelet[2921]: E0116 18:00:19.132517 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.132720 kubelet[2921]: W0116 18:00:19.132575 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.132720 kubelet[2921]: E0116 18:00:19.132590 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.132842 kubelet[2921]: E0116 18:00:19.132829 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.132902 kubelet[2921]: W0116 18:00:19.132890 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.132956 kubelet[2921]: E0116 18:00:19.132945 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.133140 kubelet[2921]: E0116 18:00:19.133128 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.133199 kubelet[2921]: W0116 18:00:19.133187 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.133251 kubelet[2921]: E0116 18:00:19.133241 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.134384 kubelet[2921]: E0116 18:00:19.134087 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.134384 kubelet[2921]: W0116 18:00:19.134104 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.134384 kubelet[2921]: E0116 18:00:19.134115 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.134640 kubelet[2921]: E0116 18:00:19.134602 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.134723 kubelet[2921]: W0116 18:00:19.134711 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.134807 kubelet[2921]: E0116 18:00:19.134796 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.135725 kubelet[2921]: E0116 18:00:19.135701 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.135725 kubelet[2921]: W0116 18:00:19.135718 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.135828 kubelet[2921]: E0116 18:00:19.135734 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.135828 kubelet[2921]: I0116 18:00:19.135767 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3-varrun\") pod \"csi-node-driver-rb578\" (UID: \"08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3\") " pod="calico-system/csi-node-driver-rb578" Jan 16 18:00:19.136557 containerd[1658]: time="2026-01-16T18:00:19.135747022Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-sx7js,Uid:09d81b66-2b5e-4347-bd6a-b47c71f2fae2,Namespace:calico-system,Attempt:0,}" Jan 16 18:00:19.136602 kubelet[2921]: E0116 18:00:19.135954 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.136602 kubelet[2921]: W0116 18:00:19.135963 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.136602 kubelet[2921]: E0116 18:00:19.135972 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.136602 kubelet[2921]: I0116 18:00:19.135986 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3-registration-dir\") pod \"csi-node-driver-rb578\" (UID: \"08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3\") " pod="calico-system/csi-node-driver-rb578" Jan 16 18:00:19.136602 kubelet[2921]: E0116 18:00:19.136502 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.136602 kubelet[2921]: W0116 18:00:19.136516 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.136602 kubelet[2921]: E0116 18:00:19.136543 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.136602 kubelet[2921]: I0116 18:00:19.136592 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3-kubelet-dir\") pod \"csi-node-driver-rb578\" (UID: \"08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3\") " pod="calico-system/csi-node-driver-rb578" Jan 16 18:00:19.137276 kubelet[2921]: E0116 18:00:19.137249 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.137276 kubelet[2921]: W0116 18:00:19.137271 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.137356 kubelet[2921]: E0116 18:00:19.137286 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.137992 kubelet[2921]: E0116 18:00:19.137964 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.137992 kubelet[2921]: W0116 18:00:19.137980 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.138069 kubelet[2921]: E0116 18:00:19.138049 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.138395 kubelet[2921]: E0116 18:00:19.138354 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.138395 kubelet[2921]: W0116 18:00:19.138370 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.138501 kubelet[2921]: E0116 18:00:19.138440 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.138713 kubelet[2921]: E0116 18:00:19.138692 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.138713 kubelet[2921]: W0116 18:00:19.138708 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.138883 kubelet[2921]: E0116 18:00:19.138750 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.138883 kubelet[2921]: I0116 18:00:19.138781 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csxnn\" (UniqueName: \"kubernetes.io/projected/08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3-kube-api-access-csxnn\") pod \"csi-node-driver-rb578\" (UID: \"08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3\") " pod="calico-system/csi-node-driver-rb578" Jan 16 18:00:19.139971 kubelet[2921]: E0116 18:00:19.139931 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.139971 kubelet[2921]: W0116 18:00:19.139963 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.140395 kubelet[2921]: E0116 18:00:19.140270 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.140395 kubelet[2921]: W0116 18:00:19.140288 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.140395 kubelet[2921]: E0116 18:00:19.140301 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.140395 kubelet[2921]: E0116 18:00:19.140365 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.140771 kubelet[2921]: E0116 18:00:19.140573 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.140829 kubelet[2921]: W0116 18:00:19.140775 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.140829 kubelet[2921]: E0116 18:00:19.140801 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.141306 kubelet[2921]: E0116 18:00:19.140965 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.141306 kubelet[2921]: W0116 18:00:19.140980 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.141306 kubelet[2921]: E0116 18:00:19.140992 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.141882 kubelet[2921]: E0116 18:00:19.141864 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.141992 kubelet[2921]: W0116 18:00:19.141971 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.142073 kubelet[2921]: E0116 18:00:19.142046 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.142566 kubelet[2921]: E0116 18:00:19.142515 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.142815 kubelet[2921]: W0116 18:00:19.142699 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.142815 kubelet[2921]: E0116 18:00:19.142726 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.142815 kubelet[2921]: I0116 18:00:19.142754 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3-socket-dir\") pod \"csi-node-driver-rb578\" (UID: \"08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3\") " pod="calico-system/csi-node-driver-rb578" Jan 16 18:00:19.143320 kubelet[2921]: E0116 18:00:19.143288 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.143407 kubelet[2921]: W0116 18:00:19.143389 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.143510 kubelet[2921]: E0116 18:00:19.143456 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.143772 kubelet[2921]: E0116 18:00:19.143759 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.143891 kubelet[2921]: W0116 18:00:19.143843 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.143891 kubelet[2921]: E0116 18:00:19.143869 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.172328 containerd[1658]: time="2026-01-16T18:00:19.171951132Z" level=info msg="connecting to shim e35fa994c2de84496e11e800d9ce4bb292ad3f89c1b81197964a8d077fe670db" address="unix:///run/containerd/s/205604bf089d31f7dc5cbcd5aa81bf33f6935ed1b77a0324ad25f9bcb9053459" namespace=k8s.io protocol=ttrpc version=3 Jan 16 18:00:19.192683 systemd[1]: Started cri-containerd-e35fa994c2de84496e11e800d9ce4bb292ad3f89c1b81197964a8d077fe670db.scope - libcontainer container e35fa994c2de84496e11e800d9ce4bb292ad3f89c1b81197964a8d077fe670db. Jan 16 18:00:19.203000 audit: BPF prog-id=156 op=LOAD Jan 16 18:00:19.204000 audit: BPF prog-id=157 op=LOAD Jan 16 18:00:19.204000 audit[3449]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001f0180 a2=98 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.204000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533356661393934633264653834343936653131653830306439636534 Jan 16 18:00:19.204000 audit: BPF prog-id=157 op=UNLOAD Jan 16 18:00:19.204000 audit[3449]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.204000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533356661393934633264653834343936653131653830306439636534 Jan 16 
18:00:19.204000 audit: BPF prog-id=158 op=LOAD Jan 16 18:00:19.204000 audit[3449]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001f03e8 a2=98 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.204000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533356661393934633264653834343936653131653830306439636534 Jan 16 18:00:19.204000 audit: BPF prog-id=159 op=LOAD Jan 16 18:00:19.204000 audit[3449]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001f0168 a2=98 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.204000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533356661393934633264653834343936653131653830306439636534 Jan 16 18:00:19.204000 audit: BPF prog-id=159 op=UNLOAD Jan 16 18:00:19.204000 audit[3449]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.204000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533356661393934633264653834343936653131653830306439636534 Jan 16 18:00:19.204000 audit: BPF prog-id=158 op=UNLOAD Jan 16 18:00:19.204000 audit[3449]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.204000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533356661393934633264653834343936653131653830306439636534 Jan 16 18:00:19.204000 audit: BPF prog-id=160 op=LOAD Jan 16 18:00:19.204000 audit[3449]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001f0648 a2=98 a3=0 items=0 ppid=3437 pid=3449 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.204000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533356661393934633264653834343936653131653830306439636534 Jan 16 18:00:19.218897 containerd[1658]: time="2026-01-16T18:00:19.218853914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-sx7js,Uid:09d81b66-2b5e-4347-bd6a-b47c71f2fae2,Namespace:calico-system,Attempt:0,} returns sandbox id \"e35fa994c2de84496e11e800d9ce4bb292ad3f89c1b81197964a8d077fe670db\"" Jan 16 18:00:19.244068 kubelet[2921]: E0116 
18:00:19.243939 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.244068 kubelet[2921]: W0116 18:00:19.243958 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.244068 kubelet[2921]: E0116 18:00:19.243975 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.244452 kubelet[2921]: E0116 18:00:19.244339 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.244452 kubelet[2921]: W0116 18:00:19.244353 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.244452 kubelet[2921]: E0116 18:00:19.244370 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.244606 kubelet[2921]: E0116 18:00:19.244586 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.244606 kubelet[2921]: W0116 18:00:19.244603 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.244721 kubelet[2921]: E0116 18:00:19.244633 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.244784 kubelet[2921]: E0116 18:00:19.244769 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.244784 kubelet[2921]: W0116 18:00:19.244781 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.244848 kubelet[2921]: E0116 18:00:19.244796 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.244935 kubelet[2921]: E0116 18:00:19.244924 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.244935 kubelet[2921]: W0116 18:00:19.244935 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.244989 kubelet[2921]: E0116 18:00:19.244947 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.245127 kubelet[2921]: E0116 18:00:19.245115 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.245160 kubelet[2921]: W0116 18:00:19.245127 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.245160 kubelet[2921]: E0116 18:00:19.245139 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.245282 kubelet[2921]: E0116 18:00:19.245272 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.245282 kubelet[2921]: W0116 18:00:19.245282 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.245336 kubelet[2921]: E0116 18:00:19.245294 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.245481 kubelet[2921]: E0116 18:00:19.245469 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.245481 kubelet[2921]: W0116 18:00:19.245481 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.245545 kubelet[2921]: E0116 18:00:19.245494 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.245645 kubelet[2921]: E0116 18:00:19.245624 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.245645 kubelet[2921]: W0116 18:00:19.245635 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.245645 kubelet[2921]: E0116 18:00:19.245667 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.245829 kubelet[2921]: E0116 18:00:19.245749 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.245829 kubelet[2921]: W0116 18:00:19.245757 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.245829 kubelet[2921]: E0116 18:00:19.245778 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.245893 kubelet[2921]: E0116 18:00:19.245874 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.245893 kubelet[2921]: W0116 18:00:19.245881 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.245893 kubelet[2921]: E0116 18:00:19.245890 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.246021 kubelet[2921]: E0116 18:00:19.246007 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.246021 kubelet[2921]: W0116 18:00:19.246017 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.246085 kubelet[2921]: E0116 18:00:19.246029 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.246157 kubelet[2921]: E0116 18:00:19.246145 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.246157 kubelet[2921]: W0116 18:00:19.246155 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.246214 kubelet[2921]: E0116 18:00:19.246166 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.246320 kubelet[2921]: E0116 18:00:19.246309 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.246320 kubelet[2921]: W0116 18:00:19.246320 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.246364 kubelet[2921]: E0116 18:00:19.246332 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.247263 kubelet[2921]: E0116 18:00:19.247244 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.247263 kubelet[2921]: W0116 18:00:19.247261 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.247320 kubelet[2921]: E0116 18:00:19.247281 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.247508 kubelet[2921]: E0116 18:00:19.247483 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.247547 kubelet[2921]: W0116 18:00:19.247510 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.247609 kubelet[2921]: E0116 18:00:19.247594 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.247724 kubelet[2921]: E0116 18:00:19.247712 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.247724 kubelet[2921]: W0116 18:00:19.247723 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.247812 kubelet[2921]: E0116 18:00:19.247795 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.247902 kubelet[2921]: E0116 18:00:19.247887 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.247902 kubelet[2921]: W0116 18:00:19.247899 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.247954 kubelet[2921]: E0116 18:00:19.247940 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.248169 kubelet[2921]: E0116 18:00:19.248108 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.248169 kubelet[2921]: W0116 18:00:19.248121 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.248169 kubelet[2921]: E0116 18:00:19.248135 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.248881 kubelet[2921]: E0116 18:00:19.248863 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.249060 kubelet[2921]: W0116 18:00:19.248932 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.249060 kubelet[2921]: E0116 18:00:19.248961 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.249232 kubelet[2921]: E0116 18:00:19.249205 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.250609 kubelet[2921]: W0116 18:00:19.250221 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.250609 kubelet[2921]: E0116 18:00:19.250295 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.250609 kubelet[2921]: E0116 18:00:19.250536 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.250609 kubelet[2921]: W0116 18:00:19.250549 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.250609 kubelet[2921]: E0116 18:00:19.250601 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.250819 kubelet[2921]: E0116 18:00:19.250806 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.250852 kubelet[2921]: W0116 18:00:19.250819 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.250915 kubelet[2921]: E0116 18:00:19.250898 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.251006 kubelet[2921]: E0116 18:00:19.250992 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.251042 kubelet[2921]: W0116 18:00:19.251017 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.251042 kubelet[2921]: E0116 18:00:19.251034 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.251230 kubelet[2921]: E0116 18:00:19.251218 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.251262 kubelet[2921]: W0116 18:00:19.251231 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.251262 kubelet[2921]: E0116 18:00:19.251241 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:19.256871 kubelet[2921]: E0116 18:00:19.256848 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:19.256871 kubelet[2921]: W0116 18:00:19.256868 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:19.256991 kubelet[2921]: E0116 18:00:19.256884 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:19.633000 audit[3502]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=3502 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:19.633000 audit[3502]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffec3deaf0 a2=0 a3=1 items=0 ppid=3072 pid=3502 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.633000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:19.641000 audit[3502]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=3502 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:19.641000 audit[3502]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffec3deaf0 a2=0 a3=1 items=0 ppid=3072 pid=3502 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:19.641000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:20.601393 kubelet[2921]: E0116 18:00:20.601340 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:00:22.601526 kubelet[2921]: E0116 18:00:22.600927 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:00:22.715710 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount690510365.mount: Deactivated successfully. 
Jan 16 18:00:22.998217 containerd[1658]: time="2026-01-16T18:00:22.998144724Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 18:00:22.999267 containerd[1658]: time="2026-01-16T18:00:22.999220327Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:23.000706 containerd[1658]: time="2026-01-16T18:00:23.000674852Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 18:00:23.003742 containerd[1658]: time="2026-01-16T18:00:23.003697381Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 18:00:23.004461 containerd[1658]: time="2026-01-16T18:00:23.004232942Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 3.966379096s" Jan 16 18:00:23.004461 containerd[1658]: time="2026-01-16T18:00:23.004285542Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Jan 16 18:00:23.005236 containerd[1658]: time="2026-01-16T18:00:23.005192425Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 16 18:00:23.012682 containerd[1658]: time="2026-01-16T18:00:23.012529887Z" level=info msg="CreateContainer within sandbox \"24c4634583ea7d42ef50d3cbbc2a4c5a2bf5fffd5680a85a65013d927f93b813\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 16 18:00:23.022316 containerd[1658]: time="2026-01-16T18:00:23.021454834Z" level=info msg="Container 83f561750b2e8a0f80cb5ab0933a51b70948d60b2f044b20058a2e2aecab5846: CDI devices from CRI Config.CDIDevices: []" Jan 16 18:00:23.028793 containerd[1658]: time="2026-01-16T18:00:23.028752657Z" level=info msg="CreateContainer within sandbox \"24c4634583ea7d42ef50d3cbbc2a4c5a2bf5fffd5680a85a65013d927f93b813\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"83f561750b2e8a0f80cb5ab0933a51b70948d60b2f044b20058a2e2aecab5846\"" Jan 16 18:00:23.029633 containerd[1658]: time="2026-01-16T18:00:23.029596979Z" level=info msg="StartContainer for \"83f561750b2e8a0f80cb5ab0933a51b70948d60b2f044b20058a2e2aecab5846\"" Jan 16 18:00:23.030653 containerd[1658]: time="2026-01-16T18:00:23.030619182Z" level=info msg="connecting to shim 83f561750b2e8a0f80cb5ab0933a51b70948d60b2f044b20058a2e2aecab5846" address="unix:///run/containerd/s/f0ef50d94a4a39976ecd1afe2ae45f1a30d6e960e18d5bc6cbe871a4bdf2aadc" protocol=ttrpc version=3 Jan 16 18:00:23.053587 systemd[1]: Started cri-containerd-83f561750b2e8a0f80cb5ab0933a51b70948d60b2f044b20058a2e2aecab5846.scope - libcontainer container 83f561750b2e8a0f80cb5ab0933a51b70948d60b2f044b20058a2e2aecab5846. 
Jan 16 18:00:23.064000 audit: BPF prog-id=161 op=LOAD Jan 16 18:00:23.065603 kernel: kauditd_printk_skb: 64 callbacks suppressed Jan 16 18:00:23.065661 kernel: audit: type=1334 audit(1768586423.064:559): prog-id=161 op=LOAD Jan 16 18:00:23.066000 audit: BPF prog-id=162 op=LOAD Jan 16 18:00:23.066000 audit[3513]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3338 pid=3513 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:23.070748 kernel: audit: type=1334 audit(1768586423.066:560): prog-id=162 op=LOAD Jan 16 18:00:23.070793 kernel: audit: type=1300 audit(1768586423.066:560): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3338 pid=3513 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:23.070812 kernel: audit: type=1327 audit(1768586423.066:560): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833663536313735306232653861306638306362356162303933336135 Jan 16 18:00:23.066000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833663536313735306232653861306638306362356162303933336135 Jan 16 18:00:23.066000 audit: BPF prog-id=162 op=UNLOAD Jan 16 18:00:23.066000 audit[3513]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3338 pid=3513 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:23.077460 kernel: audit: type=1334 audit(1768586423.066:561): prog-id=162 op=UNLOAD Jan 16 18:00:23.077559 kernel: audit: type=1300 audit(1768586423.066:561): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3338 pid=3513 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:23.077606 kernel: audit: type=1327 audit(1768586423.066:561): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833663536313735306232653861306638306362356162303933336135 Jan 16 18:00:23.066000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833663536313735306232653861306638306362356162303933336135 Jan 16 18:00:23.080886 kernel: audit: type=1334 audit(1768586423.066:562): prog-id=163 op=LOAD Jan 16 18:00:23.066000 audit: BPF prog-id=163 op=LOAD Jan 16 18:00:23.066000 audit[3513]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3338 pid=3513 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:23.085149 kernel: audit: type=1300 audit(1768586423.066:562): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3338 pid=3513 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
18:00:23.085240 kernel: audit: type=1327 audit(1768586423.066:562): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833663536313735306232653861306638306362356162303933336135 Jan 16 18:00:23.066000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833663536313735306232653861306638306362356162303933336135 Jan 16 18:00:23.067000 audit: BPF prog-id=164 op=LOAD Jan 16 18:00:23.067000 audit[3513]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3338 pid=3513 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:23.067000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833663536313735306232653861306638306362356162303933336135 Jan 16 18:00:23.073000 audit: BPF prog-id=164 op=UNLOAD Jan 16 18:00:23.073000 audit[3513]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3338 pid=3513 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:23.073000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833663536313735306232653861306638306362356162303933336135 Jan 16 18:00:23.073000 audit: BPF prog-id=163 op=UNLOAD Jan 16 18:00:23.073000 audit[3513]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3338 pid=3513 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:23.073000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833663536313735306232653861306638306362356162303933336135 Jan 16 18:00:23.073000 audit: BPF prog-id=165 op=LOAD Jan 16 18:00:23.073000 audit[3513]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3338 pid=3513 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:23.073000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833663536313735306232653861306638306362356162303933336135 Jan 16 18:00:23.110606 containerd[1658]: time="2026-01-16T18:00:23.110565344Z" level=info msg="StartContainer for \"83f561750b2e8a0f80cb5ab0933a51b70948d60b2f044b20058a2e2aecab5846\" returns successfully" Jan 16 18:00:23.703229 kubelet[2921]: I0116 18:00:23.703165 2921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="calico-system/calico-typha-6c85564bc7-jmxgc" podStartSLOduration=1.734994398 podStartE2EDuration="5.70314854s" podCreationTimestamp="2026-01-16 18:00:18 +0000 UTC" firstStartedPulling="2026-01-16 18:00:19.036853763 +0000 UTC m=+25.524641010" lastFinishedPulling="2026-01-16 18:00:23.005007905 +0000 UTC m=+29.492795152" observedRunningTime="2026-01-16 18:00:23.691172464 +0000 UTC m=+30.178959791" watchObservedRunningTime="2026-01-16 18:00:23.70314854 +0000 UTC m=+30.190935827" Jan 16 18:00:23.716000 audit[3559]: NETFILTER_CFG table=filter:119 family=2 entries=21 op=nft_register_rule pid=3559 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:23.716000 audit[3559]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffff0da6b00 a2=0 a3=1 items=0 ppid=3072 pid=3559 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:23.716000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:23.721000 audit[3559]: NETFILTER_CFG table=nat:120 family=2 entries=19 op=nft_register_chain pid=3559 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:23.721000 audit[3559]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=fffff0da6b00 a2=0 a3=1 items=0 ppid=3072 pid=3559 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:23.721000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:23.763601 kubelet[2921]: E0116 18:00:23.763574 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: 
unexpected end of JSON input Jan 16 18:00:23.763678 kubelet[2921]: W0116 18:00:23.763602 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.763678 kubelet[2921]: E0116 18:00:23.763630 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.764009 kubelet[2921]: E0116 18:00:23.763987 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.764065 kubelet[2921]: W0116 18:00:23.764016 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.764065 kubelet[2921]: E0116 18:00:23.764062 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.764231 kubelet[2921]: E0116 18:00:23.764220 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.764231 kubelet[2921]: W0116 18:00:23.764230 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.764299 kubelet[2921]: E0116 18:00:23.764239 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.764393 kubelet[2921]: E0116 18:00:23.764382 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.764393 kubelet[2921]: W0116 18:00:23.764393 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.764485 kubelet[2921]: E0116 18:00:23.764401 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.764577 kubelet[2921]: E0116 18:00:23.764537 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.764577 kubelet[2921]: W0116 18:00:23.764547 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.764577 kubelet[2921]: E0116 18:00:23.764555 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.764725 kubelet[2921]: E0116 18:00:23.764713 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.764725 kubelet[2921]: W0116 18:00:23.764724 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.764784 kubelet[2921]: E0116 18:00:23.764733 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.764912 kubelet[2921]: E0116 18:00:23.764900 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.764912 kubelet[2921]: W0116 18:00:23.764911 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.764961 kubelet[2921]: E0116 18:00:23.764919 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.765066 kubelet[2921]: E0116 18:00:23.765052 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.765066 kubelet[2921]: W0116 18:00:23.765062 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.765114 kubelet[2921]: E0116 18:00:23.765070 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.765351 kubelet[2921]: E0116 18:00:23.765339 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.765351 kubelet[2921]: W0116 18:00:23.765350 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.765417 kubelet[2921]: E0116 18:00:23.765359 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.765508 kubelet[2921]: E0116 18:00:23.765496 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.765508 kubelet[2921]: W0116 18:00:23.765506 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.765567 kubelet[2921]: E0116 18:00:23.765514 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.765647 kubelet[2921]: E0116 18:00:23.765637 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.765647 kubelet[2921]: W0116 18:00:23.765646 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.765694 kubelet[2921]: E0116 18:00:23.765653 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.765791 kubelet[2921]: E0116 18:00:23.765780 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.765791 kubelet[2921]: W0116 18:00:23.765789 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.765846 kubelet[2921]: E0116 18:00:23.765797 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.765928 kubelet[2921]: E0116 18:00:23.765917 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.765928 kubelet[2921]: W0116 18:00:23.765926 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.765980 kubelet[2921]: E0116 18:00:23.765933 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.766056 kubelet[2921]: E0116 18:00:23.766044 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.766056 kubelet[2921]: W0116 18:00:23.766054 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.766110 kubelet[2921]: E0116 18:00:23.766061 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.766186 kubelet[2921]: E0116 18:00:23.766175 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.766186 kubelet[2921]: W0116 18:00:23.766184 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.766236 kubelet[2921]: E0116 18:00:23.766191 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.776542 kubelet[2921]: E0116 18:00:23.776523 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.776542 kubelet[2921]: W0116 18:00:23.776539 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.776655 kubelet[2921]: E0116 18:00:23.776552 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.776975 kubelet[2921]: E0116 18:00:23.776961 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.776975 kubelet[2921]: W0116 18:00:23.776975 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.777028 kubelet[2921]: E0116 18:00:23.776988 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.777169 kubelet[2921]: E0116 18:00:23.777159 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.777197 kubelet[2921]: W0116 18:00:23.777169 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.777197 kubelet[2921]: E0116 18:00:23.777188 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.777375 kubelet[2921]: E0116 18:00:23.777365 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.777402 kubelet[2921]: W0116 18:00:23.777376 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.777402 kubelet[2921]: E0116 18:00:23.777393 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.777552 kubelet[2921]: E0116 18:00:23.777540 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.777552 kubelet[2921]: W0116 18:00:23.777551 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.777602 kubelet[2921]: E0116 18:00:23.777567 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.777713 kubelet[2921]: E0116 18:00:23.777703 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.777738 kubelet[2921]: W0116 18:00:23.777715 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.777738 kubelet[2921]: E0116 18:00:23.777726 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.777885 kubelet[2921]: E0116 18:00:23.777875 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.777915 kubelet[2921]: W0116 18:00:23.777885 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.777952 kubelet[2921]: E0116 18:00:23.777932 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.778066 kubelet[2921]: E0116 18:00:23.778053 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.778066 kubelet[2921]: W0116 18:00:23.778065 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.778122 kubelet[2921]: E0116 18:00:23.778109 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.778258 kubelet[2921]: E0116 18:00:23.778247 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.778280 kubelet[2921]: W0116 18:00:23.778258 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.778280 kubelet[2921]: E0116 18:00:23.778273 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.778432 kubelet[2921]: E0116 18:00:23.778415 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.778462 kubelet[2921]: W0116 18:00:23.778435 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.778462 kubelet[2921]: E0116 18:00:23.778448 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.778591 kubelet[2921]: E0116 18:00:23.778580 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.778613 kubelet[2921]: W0116 18:00:23.778591 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.778613 kubelet[2921]: E0116 18:00:23.778602 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.778754 kubelet[2921]: E0116 18:00:23.778743 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.778754 kubelet[2921]: W0116 18:00:23.778752 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.778797 kubelet[2921]: E0116 18:00:23.778769 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.779177 kubelet[2921]: E0116 18:00:23.779162 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.779205 kubelet[2921]: W0116 18:00:23.779176 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.779205 kubelet[2921]: E0116 18:00:23.779191 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.779347 kubelet[2921]: E0116 18:00:23.779336 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.779347 kubelet[2921]: W0116 18:00:23.779347 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.779402 kubelet[2921]: E0116 18:00:23.779385 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.779538 kubelet[2921]: E0116 18:00:23.779525 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.779538 kubelet[2921]: W0116 18:00:23.779537 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.779601 kubelet[2921]: E0116 18:00:23.779588 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.779705 kubelet[2921]: E0116 18:00:23.779694 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.779729 kubelet[2921]: W0116 18:00:23.779704 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.779729 kubelet[2921]: E0116 18:00:23.779723 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:23.779887 kubelet[2921]: E0116 18:00:23.779876 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.779909 kubelet[2921]: W0116 18:00:23.779887 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.779909 kubelet[2921]: E0116 18:00:23.779895 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:23.780200 kubelet[2921]: E0116 18:00:23.780186 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:23.780229 kubelet[2921]: W0116 18:00:23.780200 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:23.780229 kubelet[2921]: E0116 18:00:23.780210 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:24.602072 kubelet[2921]: E0116 18:00:24.601119 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:00:24.773123 kubelet[2921]: E0116 18:00:24.773093 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:24.773123 kubelet[2921]: W0116 18:00:24.773120 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:24.773480 kubelet[2921]: E0116 18:00:24.773142 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:24.773480 kubelet[2921]: E0116 18:00:24.773418 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:24.773480 kubelet[2921]: W0116 18:00:24.773443 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:24.773480 kubelet[2921]: E0116 18:00:24.773452 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:24.773594 kubelet[2921]: E0116 18:00:24.773584 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:24.773641 kubelet[2921]: W0116 18:00:24.773594 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:24.773641 kubelet[2921]: E0116 18:00:24.773604 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:24.773736 kubelet[2921]: E0116 18:00:24.773726 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:24.773736 kubelet[2921]: W0116 18:00:24.773736 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:24.773780 kubelet[2921]: E0116 18:00:24.773744 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:24.773890 kubelet[2921]: E0116 18:00:24.773880 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:24.773927 kubelet[2921]: W0116 18:00:24.773890 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:24.773927 kubelet[2921]: E0116 18:00:24.773898 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:24.774015 kubelet[2921]: E0116 18:00:24.774006 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:24.774015 kubelet[2921]: W0116 18:00:24.774015 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:24.774073 kubelet[2921]: E0116 18:00:24.774022 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:24.774146 kubelet[2921]: E0116 18:00:24.774137 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:24.774188 kubelet[2921]: W0116 18:00:24.774146 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:24.774188 kubelet[2921]: E0116 18:00:24.774154 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:24.774275 kubelet[2921]: E0116 18:00:24.774265 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:24.774275 kubelet[2921]: W0116 18:00:24.774274 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:24.774323 kubelet[2921]: E0116 18:00:24.774282 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:24.774467 kubelet[2921]: E0116 18:00:24.774457 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:24.774508 kubelet[2921]: W0116 18:00:24.774468 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:24.774508 kubelet[2921]: E0116 18:00:24.774476 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:24.774596 kubelet[2921]: E0116 18:00:24.774587 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:24.774596 kubelet[2921]: W0116 18:00:24.774596 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:24.774647 kubelet[2921]: E0116 18:00:24.774604 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:24.790814 kubelet[2921]: E0116 18:00:24.790792 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:24.790814 kubelet[2921]: W0116 18:00:24.790815 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:24.791144 kubelet[2921]: E0116 18:00:24.791113 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:25.781967 kubelet[2921]: E0116 18:00:25.781842 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:25.781967 kubelet[2921]: W0116 18:00:25.781867 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:25.781967 kubelet[2921]: E0116 18:00:25.781888 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:25.796950 kubelet[2921]: E0116 18:00:25.796939 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:25.796979 kubelet[2921]: W0116 18:00:25.796950 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:25.796979 kubelet[2921]: E0116 18:00:25.796957 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 18:00:25.797251 kubelet[2921]: E0116 18:00:25.797225 2921 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 18:00:25.797251 kubelet[2921]: W0116 18:00:25.797237 2921 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 18:00:25.797251 kubelet[2921]: E0116 18:00:25.797245 2921 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 18:00:26.600609 kubelet[2921]: E0116 18:00:26.600558 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:00:28.601991 kubelet[2921]: E0116 18:00:28.601920 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:00:30.508185 containerd[1658]: time="2026-01-16T18:00:30.508118998Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 18:00:30.510713 containerd[1658]: time="2026-01-16T18:00:30.510634325Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:30.512336 containerd[1658]: time="2026-01-16T18:00:30.512283970Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 18:00:30.516349 containerd[1658]: time="2026-01-16T18:00:30.516224142Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 18:00:30.517006 containerd[1658]: time="2026-01-16T18:00:30.516979824Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id 
\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 7.510566635s" Jan 16 18:00:30.517071 containerd[1658]: time="2026-01-16T18:00:30.517011785Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Jan 16 18:00:30.519584 containerd[1658]: time="2026-01-16T18:00:30.519537192Z" level=info msg="CreateContainer within sandbox \"e35fa994c2de84496e11e800d9ce4bb292ad3f89c1b81197964a8d077fe670db\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 16 18:00:30.533453 containerd[1658]: time="2026-01-16T18:00:30.533077153Z" level=info msg="Container 814020eb22f28759a8dbb5d7304b30e272817254faae37fc074ddec4d02c1e81: CDI devices from CRI Config.CDIDevices: []" Jan 16 18:00:30.544284 containerd[1658]: time="2026-01-16T18:00:30.544229547Z" level=info msg="CreateContainer within sandbox \"e35fa994c2de84496e11e800d9ce4bb292ad3f89c1b81197964a8d077fe670db\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"814020eb22f28759a8dbb5d7304b30e272817254faae37fc074ddec4d02c1e81\"" Jan 16 18:00:30.545369 containerd[1658]: time="2026-01-16T18:00:30.545249630Z" level=info msg="StartContainer for \"814020eb22f28759a8dbb5d7304b30e272817254faae37fc074ddec4d02c1e81\"" Jan 16 18:00:30.547434 containerd[1658]: time="2026-01-16T18:00:30.547395277Z" level=info msg="connecting to shim 814020eb22f28759a8dbb5d7304b30e272817254faae37fc074ddec4d02c1e81" address="unix:///run/containerd/s/205604bf089d31f7dc5cbcd5aa81bf33f6935ed1b77a0324ad25f9bcb9053459" protocol=ttrpc version=3 Jan 16 18:00:30.572662 systemd[1]: Started 
cri-containerd-814020eb22f28759a8dbb5d7304b30e272817254faae37fc074ddec4d02c1e81.scope - libcontainer container 814020eb22f28759a8dbb5d7304b30e272817254faae37fc074ddec4d02c1e81. Jan 16 18:00:30.601205 kubelet[2921]: E0116 18:00:30.601147 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:00:30.633000 audit: BPF prog-id=166 op=LOAD Jan 16 18:00:30.634702 kernel: kauditd_printk_skb: 18 callbacks suppressed Jan 16 18:00:30.634748 kernel: audit: type=1334 audit(1768586430.633:569): prog-id=166 op=LOAD Jan 16 18:00:30.633000 audit[3663]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3437 pid=3663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:30.638978 kernel: audit: type=1300 audit(1768586430.633:569): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3437 pid=3663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:30.639129 kernel: audit: type=1327 audit(1768586430.633:569): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343032306562323266323837353961386462623564373330346233 Jan 16 18:00:30.633000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343032306562323266323837353961386462623564373330346233 Jan 16 18:00:30.635000 audit: BPF prog-id=167 op=LOAD Jan 16 18:00:30.643449 kernel: audit: type=1334 audit(1768586430.635:570): prog-id=167 op=LOAD Jan 16 18:00:30.643484 kernel: audit: type=1300 audit(1768586430.635:570): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3437 pid=3663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:30.635000 audit[3663]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3437 pid=3663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:30.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343032306562323266323837353961386462623564373330346233 Jan 16 18:00:30.650048 kernel: audit: type=1327 audit(1768586430.635:570): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343032306562323266323837353961386462623564373330346233 Jan 16 18:00:30.638000 audit: BPF prog-id=167 op=UNLOAD Jan 16 18:00:30.638000 audit[3663]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:30.654543 kernel: audit: type=1334 audit(1768586430.638:571): prog-id=167 op=UNLOAD Jan 16 18:00:30.654606 kernel: audit: type=1300 audit(1768586430.638:571): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:30.654635 kernel: audit: type=1327 audit(1768586430.638:571): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343032306562323266323837353961386462623564373330346233 Jan 16 18:00:30.638000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343032306562323266323837353961386462623564373330346233 Jan 16 18:00:30.638000 audit: BPF prog-id=166 op=UNLOAD Jan 16 18:00:30.659048 kernel: audit: type=1334 audit(1768586430.638:572): prog-id=166 op=UNLOAD Jan 16 18:00:30.638000 audit[3663]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:30.638000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343032306562323266323837353961386462623564373330346233 Jan 16 18:00:30.638000 
audit: BPF prog-id=168 op=LOAD Jan 16 18:00:30.638000 audit[3663]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3437 pid=3663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:30.638000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3831343032306562323266323837353961386462623564373330346233 Jan 16 18:00:30.672372 containerd[1658]: time="2026-01-16T18:00:30.672310135Z" level=info msg="StartContainer for \"814020eb22f28759a8dbb5d7304b30e272817254faae37fc074ddec4d02c1e81\" returns successfully" Jan 16 18:00:30.685314 systemd[1]: cri-containerd-814020eb22f28759a8dbb5d7304b30e272817254faae37fc074ddec4d02c1e81.scope: Deactivated successfully. Jan 16 18:00:30.688704 containerd[1658]: time="2026-01-16T18:00:30.688576144Z" level=info msg="received container exit event container_id:\"814020eb22f28759a8dbb5d7304b30e272817254faae37fc074ddec4d02c1e81\" id:\"814020eb22f28759a8dbb5d7304b30e272817254faae37fc074ddec4d02c1e81\" pid:3676 exited_at:{seconds:1768586430 nanos:688037663}" Jan 16 18:00:30.689000 audit: BPF prog-id=168 op=UNLOAD Jan 16 18:00:30.714528 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-814020eb22f28759a8dbb5d7304b30e272817254faae37fc074ddec4d02c1e81-rootfs.mount: Deactivated successfully. 
Jan 16 18:00:31.698889 containerd[1658]: time="2026-01-16T18:00:31.698568484Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 16 18:00:32.601468 kubelet[2921]: E0116 18:00:32.601185 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:00:34.259012 containerd[1658]: time="2026-01-16T18:00:34.258957162Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 18:00:34.260220 containerd[1658]: time="2026-01-16T18:00:34.260178686Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Jan 16 18:00:34.261306 containerd[1658]: time="2026-01-16T18:00:34.261250609Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 18:00:34.263480 containerd[1658]: time="2026-01-16T18:00:34.263446216Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 18:00:34.264329 containerd[1658]: time="2026-01-16T18:00:34.264280378Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 2.565673093s" Jan 16 18:00:34.264329 containerd[1658]: time="2026-01-16T18:00:34.264313618Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Jan 16 18:00:34.266495 containerd[1658]: time="2026-01-16T18:00:34.266467145Z" level=info msg="CreateContainer within sandbox \"e35fa994c2de84496e11e800d9ce4bb292ad3f89c1b81197964a8d077fe670db\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 16 18:00:34.276644 containerd[1658]: time="2026-01-16T18:00:34.276599655Z" level=info msg="Container 513a4d8fa376bad0c685b0564f94c05c7b41f1c318d3f80b30c64bc4370b1d6e: CDI devices from CRI Config.CDIDevices: []" Jan 16 18:00:34.286243 containerd[1658]: time="2026-01-16T18:00:34.286181604Z" level=info msg="CreateContainer within sandbox \"e35fa994c2de84496e11e800d9ce4bb292ad3f89c1b81197964a8d077fe670db\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"513a4d8fa376bad0c685b0564f94c05c7b41f1c318d3f80b30c64bc4370b1d6e\"" Jan 16 18:00:34.286867 containerd[1658]: time="2026-01-16T18:00:34.286578806Z" level=info msg="StartContainer for \"513a4d8fa376bad0c685b0564f94c05c7b41f1c318d3f80b30c64bc4370b1d6e\"" Jan 16 18:00:34.288506 containerd[1658]: time="2026-01-16T18:00:34.288479371Z" level=info msg="connecting to shim 513a4d8fa376bad0c685b0564f94c05c7b41f1c318d3f80b30c64bc4370b1d6e" address="unix:///run/containerd/s/205604bf089d31f7dc5cbcd5aa81bf33f6935ed1b77a0324ad25f9bcb9053459" protocol=ttrpc version=3 Jan 16 18:00:34.312657 systemd[1]: Started cri-containerd-513a4d8fa376bad0c685b0564f94c05c7b41f1c318d3f80b30c64bc4370b1d6e.scope - libcontainer container 513a4d8fa376bad0c685b0564f94c05c7b41f1c318d3f80b30c64bc4370b1d6e. 
Jan 16 18:00:34.350000 audit: BPF prog-id=169 op=LOAD Jan 16 18:00:34.350000 audit[3727]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3437 pid=3727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:34.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3531336134643866613337366261643063363835623035363466393463 Jan 16 18:00:34.350000 audit: BPF prog-id=170 op=LOAD Jan 16 18:00:34.350000 audit[3727]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3437 pid=3727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:34.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3531336134643866613337366261643063363835623035363466393463 Jan 16 18:00:34.351000 audit: BPF prog-id=170 op=UNLOAD Jan 16 18:00:34.351000 audit[3727]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:34.351000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3531336134643866613337366261643063363835623035363466393463 Jan 16 18:00:34.351000 audit: BPF prog-id=169 op=UNLOAD Jan 16 18:00:34.351000 audit[3727]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=3727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:34.351000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3531336134643866613337366261643063363835623035363466393463 Jan 16 18:00:34.351000 audit: BPF prog-id=171 op=LOAD Jan 16 18:00:34.351000 audit[3727]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3437 pid=3727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:34.351000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3531336134643866613337366261643063363835623035363466393463 Jan 16 18:00:34.368233 containerd[1658]: time="2026-01-16T18:00:34.368195893Z" level=info msg="StartContainer for \"513a4d8fa376bad0c685b0564f94c05c7b41f1c318d3f80b30c64bc4370b1d6e\" returns successfully" Jan 16 18:00:34.601653 kubelet[2921]: E0116 18:00:34.601225 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:00:34.760513 systemd[1]: cri-containerd-513a4d8fa376bad0c685b0564f94c05c7b41f1c318d3f80b30c64bc4370b1d6e.scope: Deactivated successfully. Jan 16 18:00:34.760816 systemd[1]: cri-containerd-513a4d8fa376bad0c685b0564f94c05c7b41f1c318d3f80b30c64bc4370b1d6e.scope: Consumed 463ms CPU time, 194.6M memory peak, 165.9M written to disk. Jan 16 18:00:34.761913 kubelet[2921]: I0116 18:00:34.761888 2921 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 16 18:00:34.763807 containerd[1658]: time="2026-01-16T18:00:34.763776091Z" level=info msg="received container exit event container_id:\"513a4d8fa376bad0c685b0564f94c05c7b41f1c318d3f80b30c64bc4370b1d6e\" id:\"513a4d8fa376bad0c685b0564f94c05c7b41f1c318d3f80b30c64bc4370b1d6e\" pid:3740 exited_at:{seconds:1768586434 nanos:763441330}" Jan 16 18:00:34.765000 audit: BPF prog-id=171 op=UNLOAD Jan 16 18:00:34.806127 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-513a4d8fa376bad0c685b0564f94c05c7b41f1c318d3f80b30c64bc4370b1d6e-rootfs.mount: Deactivated successfully. Jan 16 18:00:34.811471 systemd[1]: Created slice kubepods-burstable-pod2f2a1d19_8f6a_44c8_adb8_cc33e1ecebce.slice - libcontainer container kubepods-burstable-pod2f2a1d19_8f6a_44c8_adb8_cc33e1ecebce.slice. Jan 16 18:00:34.820001 systemd[1]: Created slice kubepods-burstable-podbe139626_cf50_4387_b6b7_3299ee34ffb2.slice - libcontainer container kubepods-burstable-podbe139626_cf50_4387_b6b7_3299ee34ffb2.slice. Jan 16 18:00:34.827547 systemd[1]: Created slice kubepods-besteffort-pod3860e0e2_bc4a_40cc_b58d_f0b04ee81f50.slice - libcontainer container kubepods-besteffort-pod3860e0e2_bc4a_40cc_b58d_f0b04ee81f50.slice. 
Jan 16 18:00:34.834858 systemd[1]: Created slice kubepods-besteffort-pod2d37882f_c058_4d23_87c0_40e1fbaf0de7.slice - libcontainer container kubepods-besteffort-pod2d37882f_c058_4d23_87c0_40e1fbaf0de7.slice. Jan 16 18:00:34.850733 systemd[1]: Created slice kubepods-besteffort-podd1e9d740_842b_4bc4_a9fc_8b41a7b03ee3.slice - libcontainer container kubepods-besteffort-podd1e9d740_842b_4bc4_a9fc_8b41a7b03ee3.slice. Jan 16 18:00:34.853290 kubelet[2921]: I0116 18:00:34.851498 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64da405a-12aa-44ec-8b4d-a44866f591ec-goldmane-ca-bundle\") pod \"goldmane-666569f655-j5dxx\" (UID: \"64da405a-12aa-44ec-8b4d-a44866f591ec\") " pod="calico-system/goldmane-666569f655-j5dxx" Jan 16 18:00:34.854804 kubelet[2921]: I0116 18:00:34.854769 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbbxs\" (UniqueName: \"kubernetes.io/projected/64da405a-12aa-44ec-8b4d-a44866f591ec-kube-api-access-mbbxs\") pod \"goldmane-666569f655-j5dxx\" (UID: \"64da405a-12aa-44ec-8b4d-a44866f591ec\") " pod="calico-system/goldmane-666569f655-j5dxx" Jan 16 18:00:34.854878 kubelet[2921]: I0116 18:00:34.854823 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkdsc\" (UniqueName: \"kubernetes.io/projected/2f2a1d19-8f6a-44c8-adb8-cc33e1ecebce-kube-api-access-bkdsc\") pod \"coredns-668d6bf9bc-t28mh\" (UID: \"2f2a1d19-8f6a-44c8-adb8-cc33e1ecebce\") " pod="kube-system/coredns-668d6bf9bc-t28mh" Jan 16 18:00:34.854878 kubelet[2921]: I0116 18:00:34.854847 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64da405a-12aa-44ec-8b4d-a44866f591ec-config\") pod \"goldmane-666569f655-j5dxx\" (UID: \"64da405a-12aa-44ec-8b4d-a44866f591ec\") " 
pod="calico-system/goldmane-666569f655-j5dxx" Jan 16 18:00:34.854878 kubelet[2921]: I0116 18:00:34.854871 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vnxx\" (UniqueName: \"kubernetes.io/projected/d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3-kube-api-access-5vnxx\") pod \"whisker-9db97867b-q8hz7\" (UID: \"d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3\") " pod="calico-system/whisker-9db97867b-q8hz7" Jan 16 18:00:34.854954 kubelet[2921]: I0116 18:00:34.854894 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84qv8\" (UniqueName: \"kubernetes.io/projected/cbb94aae-850a-4651-be6c-0c622d131a34-kube-api-access-84qv8\") pod \"calico-kube-controllers-7b6469dfdd-hzhpv\" (UID: \"cbb94aae-850a-4651-be6c-0c622d131a34\") " pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" Jan 16 18:00:34.854954 kubelet[2921]: I0116 18:00:34.854919 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3860e0e2-bc4a-40cc-b58d-f0b04ee81f50-calico-apiserver-certs\") pod \"calico-apiserver-5cdd49cc55-ql8tz\" (UID: \"3860e0e2-bc4a-40cc-b58d-f0b04ee81f50\") " pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" Jan 16 18:00:34.854954 kubelet[2921]: I0116 18:00:34.854946 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f2a1d19-8f6a-44c8-adb8-cc33e1ecebce-config-volume\") pod \"coredns-668d6bf9bc-t28mh\" (UID: \"2f2a1d19-8f6a-44c8-adb8-cc33e1ecebce\") " pod="kube-system/coredns-668d6bf9bc-t28mh" Jan 16 18:00:34.855018 kubelet[2921]: I0116 18:00:34.854967 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdp42\" (UniqueName: 
\"kubernetes.io/projected/2d37882f-c058-4d23-87c0-40e1fbaf0de7-kube-api-access-rdp42\") pod \"calico-apiserver-5cdd49cc55-ck2gc\" (UID: \"2d37882f-c058-4d23-87c0-40e1fbaf0de7\") " pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc"
Jan 16 18:00:34.855018 kubelet[2921]: I0116 18:00:34.854988 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbb94aae-850a-4651-be6c-0c622d131a34-tigera-ca-bundle\") pod \"calico-kube-controllers-7b6469dfdd-hzhpv\" (UID: \"cbb94aae-850a-4651-be6c-0c622d131a34\") " pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv"
Jan 16 18:00:34.855018 kubelet[2921]: I0116 18:00:34.855008 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3-whisker-backend-key-pair\") pod \"whisker-9db97867b-q8hz7\" (UID: \"d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3\") " pod="calico-system/whisker-9db97867b-q8hz7"
Jan 16 18:00:34.855081 kubelet[2921]: I0116 18:00:34.855028 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3-whisker-ca-bundle\") pod \"whisker-9db97867b-q8hz7\" (UID: \"d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3\") " pod="calico-system/whisker-9db97867b-q8hz7"
Jan 16 18:00:34.855081 kubelet[2921]: I0116 18:00:34.855053 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2d37882f-c058-4d23-87c0-40e1fbaf0de7-calico-apiserver-certs\") pod \"calico-apiserver-5cdd49cc55-ck2gc\" (UID: \"2d37882f-c058-4d23-87c0-40e1fbaf0de7\") " pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc"
Jan 16 18:00:34.855081 kubelet[2921]: I0116 18:00:34.855071 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be139626-cf50-4387-b6b7-3299ee34ffb2-config-volume\") pod \"coredns-668d6bf9bc-5zv4w\" (UID: \"be139626-cf50-4387-b6b7-3299ee34ffb2\") " pod="kube-system/coredns-668d6bf9bc-5zv4w"
Jan 16 18:00:34.855144 kubelet[2921]: I0116 18:00:34.855095 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp255\" (UniqueName: \"kubernetes.io/projected/be139626-cf50-4387-b6b7-3299ee34ffb2-kube-api-access-rp255\") pod \"coredns-668d6bf9bc-5zv4w\" (UID: \"be139626-cf50-4387-b6b7-3299ee34ffb2\") " pod="kube-system/coredns-668d6bf9bc-5zv4w"
Jan 16 18:00:34.855144 kubelet[2921]: I0116 18:00:34.855125 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzkc8\" (UniqueName: \"kubernetes.io/projected/3860e0e2-bc4a-40cc-b58d-f0b04ee81f50-kube-api-access-hzkc8\") pod \"calico-apiserver-5cdd49cc55-ql8tz\" (UID: \"3860e0e2-bc4a-40cc-b58d-f0b04ee81f50\") " pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz"
Jan 16 18:00:34.855186 kubelet[2921]: I0116 18:00:34.855145 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/64da405a-12aa-44ec-8b4d-a44866f591ec-goldmane-key-pair\") pod \"goldmane-666569f655-j5dxx\" (UID: \"64da405a-12aa-44ec-8b4d-a44866f591ec\") " pod="calico-system/goldmane-666569f655-j5dxx"
Jan 16 18:00:34.859151 systemd[1]: Created slice kubepods-besteffort-podcbb94aae_850a_4651_be6c_0c622d131a34.slice - libcontainer container kubepods-besteffort-podcbb94aae_850a_4651_be6c_0c622d131a34.slice.
Jan 16 18:00:34.866634 systemd[1]: Created slice kubepods-besteffort-pod64da405a_12aa_44ec_8b4d_a44866f591ec.slice - libcontainer container kubepods-besteffort-pod64da405a_12aa_44ec_8b4d_a44866f591ec.slice.
Jan 16 18:00:35.115939 containerd[1658]: time="2026-01-16T18:00:35.115837318Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-t28mh,Uid:2f2a1d19-8f6a-44c8-adb8-cc33e1ecebce,Namespace:kube-system,Attempt:0,}"
Jan 16 18:00:35.134787 containerd[1658]: time="2026-01-16T18:00:35.134733055Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cdd49cc55-ql8tz,Uid:3860e0e2-bc4a-40cc-b58d-f0b04ee81f50,Namespace:calico-apiserver,Attempt:0,}"
Jan 16 18:00:35.134956 containerd[1658]: time="2026-01-16T18:00:35.134733495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-5zv4w,Uid:be139626-cf50-4387-b6b7-3299ee34ffb2,Namespace:kube-system,Attempt:0,}"
Jan 16 18:00:35.142546 containerd[1658]: time="2026-01-16T18:00:35.142504559Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cdd49cc55-ck2gc,Uid:2d37882f-c058-4d23-87c0-40e1fbaf0de7,Namespace:calico-apiserver,Attempt:0,}"
Jan 16 18:00:35.155606 containerd[1658]: time="2026-01-16T18:00:35.155552478Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-9db97867b-q8hz7,Uid:d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3,Namespace:calico-system,Attempt:0,}"
Jan 16 18:00:35.168439 containerd[1658]: time="2026-01-16T18:00:35.168375797Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b6469dfdd-hzhpv,Uid:cbb94aae-850a-4651-be6c-0c622d131a34,Namespace:calico-system,Attempt:0,}"
Jan 16 18:00:35.171241 containerd[1658]: time="2026-01-16T18:00:35.171213206Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-j5dxx,Uid:64da405a-12aa-44ec-8b4d-a44866f591ec,Namespace:calico-system,Attempt:0,}"
Jan 16 18:00:35.212979 containerd[1658]: time="2026-01-16T18:00:35.212543291Z" level=error msg="Failed to destroy network for sandbox \"5b70c70b04997598437eada53da8901f7f97ce7ab4a0afb3054f7b12a8a4a43d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.219222 containerd[1658]: time="2026-01-16T18:00:35.219136751Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-t28mh,Uid:2f2a1d19-8f6a-44c8-adb8-cc33e1ecebce,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b70c70b04997598437eada53da8901f7f97ce7ab4a0afb3054f7b12a8a4a43d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.219646 kubelet[2921]: E0116 18:00:35.219594 2921 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b70c70b04997598437eada53da8901f7f97ce7ab4a0afb3054f7b12a8a4a43d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.219727 kubelet[2921]: E0116 18:00:35.219680 2921 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b70c70b04997598437eada53da8901f7f97ce7ab4a0afb3054f7b12a8a4a43d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-t28mh"
Jan 16 18:00:35.219727 kubelet[2921]: E0116 18:00:35.219700 2921 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b70c70b04997598437eada53da8901f7f97ce7ab4a0afb3054f7b12a8a4a43d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-t28mh"
Jan 16 18:00:35.219804 kubelet[2921]: E0116 18:00:35.219747 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-t28mh_kube-system(2f2a1d19-8f6a-44c8-adb8-cc33e1ecebce)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-t28mh_kube-system(2f2a1d19-8f6a-44c8-adb8-cc33e1ecebce)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5b70c70b04997598437eada53da8901f7f97ce7ab4a0afb3054f7b12a8a4a43d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-t28mh" podUID="2f2a1d19-8f6a-44c8-adb8-cc33e1ecebce"
Jan 16 18:00:35.240359 containerd[1658]: time="2026-01-16T18:00:35.240312975Z" level=error msg="Failed to destroy network for sandbox \"0935e124aab87a1e683fb09df0ec6759ee64afd20d37cc1c22afa5ea739280b8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.246802 containerd[1658]: time="2026-01-16T18:00:35.246749395Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cdd49cc55-ql8tz,Uid:3860e0e2-bc4a-40cc-b58d-f0b04ee81f50,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0935e124aab87a1e683fb09df0ec6759ee64afd20d37cc1c22afa5ea739280b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.247579 kubelet[2921]: E0116 18:00:35.247173 2921 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0935e124aab87a1e683fb09df0ec6759ee64afd20d37cc1c22afa5ea739280b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.247579 kubelet[2921]: E0116 18:00:35.247233 2921 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0935e124aab87a1e683fb09df0ec6759ee64afd20d37cc1c22afa5ea739280b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz"
Jan 16 18:00:35.247579 kubelet[2921]: E0116 18:00:35.247252 2921 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0935e124aab87a1e683fb09df0ec6759ee64afd20d37cc1c22afa5ea739280b8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz"
Jan 16 18:00:35.247735 kubelet[2921]: E0116 18:00:35.247299 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5cdd49cc55-ql8tz_calico-apiserver(3860e0e2-bc4a-40cc-b58d-f0b04ee81f50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5cdd49cc55-ql8tz_calico-apiserver(3860e0e2-bc4a-40cc-b58d-f0b04ee81f50)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0935e124aab87a1e683fb09df0ec6759ee64afd20d37cc1c22afa5ea739280b8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50"
Jan 16 18:00:35.256793 containerd[1658]: time="2026-01-16T18:00:35.256750385Z" level=error msg="Failed to destroy network for sandbox \"f4fa6364f38dcc162589721c8a80d085a0e4510c238154bce3773168119a8304\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.263484 containerd[1658]: time="2026-01-16T18:00:35.263436445Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cdd49cc55-ck2gc,Uid:2d37882f-c058-4d23-87c0-40e1fbaf0de7,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4fa6364f38dcc162589721c8a80d085a0e4510c238154bce3773168119a8304\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.264389 kubelet[2921]: E0116 18:00:35.264339 2921 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4fa6364f38dcc162589721c8a80d085a0e4510c238154bce3773168119a8304\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.264518 kubelet[2921]: E0116 18:00:35.264401 2921 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4fa6364f38dcc162589721c8a80d085a0e4510c238154bce3773168119a8304\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc"
Jan 16 18:00:35.264645 kubelet[2921]: E0116 18:00:35.264564 2921 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4fa6364f38dcc162589721c8a80d085a0e4510c238154bce3773168119a8304\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc"
Jan 16 18:00:35.265653 kubelet[2921]: E0116 18:00:35.264639 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5cdd49cc55-ck2gc_calico-apiserver(2d37882f-c058-4d23-87c0-40e1fbaf0de7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5cdd49cc55-ck2gc_calico-apiserver(2d37882f-c058-4d23-87c0-40e1fbaf0de7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f4fa6364f38dcc162589721c8a80d085a0e4510c238154bce3773168119a8304\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7"
Jan 16 18:00:35.281810 containerd[1658]: time="2026-01-16T18:00:35.281768061Z" level=error msg="Failed to destroy network for sandbox \"c2a7ffad31287fff386d69bfe272c3e7a8c44d645b53888a3bb53e6c056c98de\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.282846 containerd[1658]: time="2026-01-16T18:00:35.282778104Z" level=error msg="Failed to destroy network for sandbox \"5e4658ac3b4821976aef63972142df50f08438acfe6202f826590f71e9b81049\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.290515 containerd[1658]: time="2026-01-16T18:00:35.290317167Z" level=error msg="Failed to destroy network for sandbox \"13b58dd790703f6e936ba1dbe1ef8863755bef654c3b6a6660d43bd534aa3ace\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.291400 systemd[1]: run-netns-cni\x2da0d9edb4\x2d8d37\x2d8299\x2df41c\x2dcc7862511de2.mount: Deactivated successfully.
Jan 16 18:00:35.291506 systemd[1]: run-netns-cni\x2d107b84c5\x2da8a9\x2d585a\x2d9030\x2d6ad25093d5fd.mount: Deactivated successfully.
Jan 16 18:00:35.293616 systemd[1]: run-netns-cni\x2d213ac776\x2dc7e6\x2d5942\x2d7bd6\x2d14385d83c17f.mount: Deactivated successfully.
Jan 16 18:00:35.297820 containerd[1658]: time="2026-01-16T18:00:35.297763709Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-5zv4w,Uid:be139626-cf50-4387-b6b7-3299ee34ffb2,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2a7ffad31287fff386d69bfe272c3e7a8c44d645b53888a3bb53e6c056c98de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.298092 containerd[1658]: time="2026-01-16T18:00:35.298065430Z" level=error msg="Failed to destroy network for sandbox \"0b2021a517411ae0181ac7a19c28df4c6d0777345434406e73dd3c1373d467ee\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.298139 kubelet[2921]: E0116 18:00:35.298087 2921 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2a7ffad31287fff386d69bfe272c3e7a8c44d645b53888a3bb53e6c056c98de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.298174 kubelet[2921]: E0116 18:00:35.298139 2921 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2a7ffad31287fff386d69bfe272c3e7a8c44d645b53888a3bb53e6c056c98de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-5zv4w"
Jan 16 18:00:35.298174 kubelet[2921]: E0116 18:00:35.298158 2921 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2a7ffad31287fff386d69bfe272c3e7a8c44d645b53888a3bb53e6c056c98de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-5zv4w"
Jan 16 18:00:35.298229 kubelet[2921]: E0116 18:00:35.298208 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-5zv4w_kube-system(be139626-cf50-4387-b6b7-3299ee34ffb2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-5zv4w_kube-system(be139626-cf50-4387-b6b7-3299ee34ffb2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c2a7ffad31287fff386d69bfe272c3e7a8c44d645b53888a3bb53e6c056c98de\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-5zv4w" podUID="be139626-cf50-4387-b6b7-3299ee34ffb2"
Jan 16 18:00:35.299926 systemd[1]: run-netns-cni\x2dfa9d263a\x2d9ad9\x2decd7\x2dc20e\x2dc8e3d30635df.mount: Deactivated successfully.
Jan 16 18:00:35.305858 containerd[1658]: time="2026-01-16T18:00:35.305800094Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-9db97867b-q8hz7,Uid:d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"13b58dd790703f6e936ba1dbe1ef8863755bef654c3b6a6660d43bd534aa3ace\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.306137 containerd[1658]: time="2026-01-16T18:00:35.306066854Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b6469dfdd-hzhpv,Uid:cbb94aae-850a-4651-be6c-0c622d131a34,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e4658ac3b4821976aef63972142df50f08438acfe6202f826590f71e9b81049\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.307132 kubelet[2921]: E0116 18:00:35.306816 2921 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13b58dd790703f6e936ba1dbe1ef8863755bef654c3b6a6660d43bd534aa3ace\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.307132 kubelet[2921]: E0116 18:00:35.306874 2921 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13b58dd790703f6e936ba1dbe1ef8863755bef654c3b6a6660d43bd534aa3ace\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-9db97867b-q8hz7"
Jan 16 18:00:35.307132 kubelet[2921]: E0116 18:00:35.306892 2921 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13b58dd790703f6e936ba1dbe1ef8863755bef654c3b6a6660d43bd534aa3ace\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-9db97867b-q8hz7"
Jan 16 18:00:35.307240 kubelet[2921]: E0116 18:00:35.306932 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-9db97867b-q8hz7_calico-system(d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-9db97867b-q8hz7_calico-system(d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"13b58dd790703f6e936ba1dbe1ef8863755bef654c3b6a6660d43bd534aa3ace\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-9db97867b-q8hz7" podUID="d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3"
Jan 16 18:00:35.307677 kubelet[2921]: E0116 18:00:35.307650 2921 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e4658ac3b4821976aef63972142df50f08438acfe6202f826590f71e9b81049\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.307824 kubelet[2921]: E0116 18:00:35.307807 2921 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e4658ac3b4821976aef63972142df50f08438acfe6202f826590f71e9b81049\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv"
Jan 16 18:00:35.307904 kubelet[2921]: E0116 18:00:35.307889 2921 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e4658ac3b4821976aef63972142df50f08438acfe6202f826590f71e9b81049\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv"
Jan 16 18:00:35.308015 kubelet[2921]: E0116 18:00:35.307990 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7b6469dfdd-hzhpv_calico-system(cbb94aae-850a-4651-be6c-0c622d131a34)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7b6469dfdd-hzhpv_calico-system(cbb94aae-850a-4651-be6c-0c622d131a34)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5e4658ac3b4821976aef63972142df50f08438acfe6202f826590f71e9b81049\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34"
Jan 16 18:00:35.311307 containerd[1658]: time="2026-01-16T18:00:35.311259750Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-j5dxx,Uid:64da405a-12aa-44ec-8b4d-a44866f591ec,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b2021a517411ae0181ac7a19c28df4c6d0777345434406e73dd3c1373d467ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.311474 kubelet[2921]: E0116 18:00:35.311455 2921 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b2021a517411ae0181ac7a19c28df4c6d0777345434406e73dd3c1373d467ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:35.311536 kubelet[2921]: E0116 18:00:35.311489 2921 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b2021a517411ae0181ac7a19c28df4c6d0777345434406e73dd3c1373d467ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-j5dxx"
Jan 16 18:00:35.311536 kubelet[2921]: E0116 18:00:35.311506 2921 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b2021a517411ae0181ac7a19c28df4c6d0777345434406e73dd3c1373d467ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-j5dxx"
Jan 16 18:00:35.311617 kubelet[2921]: E0116 18:00:35.311531 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-j5dxx_calico-system(64da405a-12aa-44ec-8b4d-a44866f591ec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-j5dxx_calico-system(64da405a-12aa-44ec-8b4d-a44866f591ec)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0b2021a517411ae0181ac7a19c28df4c6d0777345434406e73dd3c1373d467ee\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec"
Jan 16 18:00:35.712300 containerd[1658]: time="2026-01-16T18:00:35.712237645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\""
Jan 16 18:00:36.606974 systemd[1]: Created slice kubepods-besteffort-pod08cc5ff3_c204_4b6d_8d83_d5c9e19e6ce3.slice - libcontainer container kubepods-besteffort-pod08cc5ff3_c204_4b6d_8d83_d5c9e19e6ce3.slice.
Jan 16 18:00:36.609347 containerd[1658]: time="2026-01-16T18:00:36.609316323Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rb578,Uid:08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3,Namespace:calico-system,Attempt:0,}"
Jan 16 18:00:36.659505 containerd[1658]: time="2026-01-16T18:00:36.659449715Z" level=error msg="Failed to destroy network for sandbox \"3d72fa56d46f66577c7621a177075b22aca4be78492c434a6b06e033d80bd0eb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:36.661245 systemd[1]: run-netns-cni\x2de5c282b1\x2dbdc9\x2dfd3f\x2dff05\x2d30cbe41a5af2.mount: Deactivated successfully.
Jan 16 18:00:36.669922 containerd[1658]: time="2026-01-16T18:00:36.669862867Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rb578,Uid:08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d72fa56d46f66577c7621a177075b22aca4be78492c434a6b06e033d80bd0eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:36.670114 kubelet[2921]: E0116 18:00:36.670073 2921 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d72fa56d46f66577c7621a177075b22aca4be78492c434a6b06e033d80bd0eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jan 16 18:00:36.670367 kubelet[2921]: E0116 18:00:36.670130 2921 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d72fa56d46f66577c7621a177075b22aca4be78492c434a6b06e033d80bd0eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rb578"
Jan 16 18:00:36.670367 kubelet[2921]: E0116 18:00:36.670151 2921 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d72fa56d46f66577c7621a177075b22aca4be78492c434a6b06e033d80bd0eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rb578"
Jan 16 18:00:36.670367 kubelet[2921]: E0116 18:00:36.670190 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-rb578_calico-system(08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-rb578_calico-system(08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3d72fa56d46f66577c7621a177075b22aca4be78492c434a6b06e033d80bd0eb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3"
Jan 16 18:00:39.170692 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3355676018.mount: Deactivated successfully.
Jan 16 18:00:39.193773 containerd[1658]: time="2026-01-16T18:00:39.193651393Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 16 18:00:39.194855 containerd[1658]: time="2026-01-16T18:00:39.194800077Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912"
Jan 16 18:00:39.196870 containerd[1658]: time="2026-01-16T18:00:39.196819363Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 16 18:00:39.199097 containerd[1658]: time="2026-01-16T18:00:39.199031209Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 16 18:00:39.199639 containerd[1658]: time="2026-01-16T18:00:39.199615491Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 3.487312166s"
Jan 16 18:00:39.199683 containerd[1658]: time="2026-01-16T18:00:39.199643051Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\""
Jan 16 18:00:39.210266 containerd[1658]: time="2026-01-16T18:00:39.210217723Z" level=info msg="CreateContainer within sandbox \"e35fa994c2de84496e11e800d9ce4bb292ad3f89c1b81197964a8d077fe670db\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Jan 16 18:00:39.257506 containerd[1658]: time="2026-01-16T18:00:39.256519304Z" level=info msg="Container 54fd15b9145243644143b04ad8c81d86369309ad7687635351dfe263f7d933aa: CDI devices from CRI Config.CDIDevices: []"
Jan 16 18:00:39.277070 containerd[1658]: time="2026-01-16T18:00:39.277013086Z" level=info msg="CreateContainer within sandbox \"e35fa994c2de84496e11e800d9ce4bb292ad3f89c1b81197964a8d077fe670db\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"54fd15b9145243644143b04ad8c81d86369309ad7687635351dfe263f7d933aa\""
Jan 16 18:00:39.278138 containerd[1658]: time="2026-01-16T18:00:39.278091329Z" level=info msg="StartContainer for \"54fd15b9145243644143b04ad8c81d86369309ad7687635351dfe263f7d933aa\""
Jan 16 18:00:39.279896 containerd[1658]: time="2026-01-16T18:00:39.279861174Z" level=info msg="connecting to shim 54fd15b9145243644143b04ad8c81d86369309ad7687635351dfe263f7d933aa" address="unix:///run/containerd/s/205604bf089d31f7dc5cbcd5aa81bf33f6935ed1b77a0324ad25f9bcb9053459" protocol=ttrpc version=3
Jan 16 18:00:39.303601 systemd[1]: Started cri-containerd-54fd15b9145243644143b04ad8c81d86369309ad7687635351dfe263f7d933aa.scope - libcontainer container 54fd15b9145243644143b04ad8c81d86369309ad7687635351dfe263f7d933aa.
Jan 16 18:00:39.376000 audit: BPF prog-id=172 op=LOAD
Jan 16 18:00:39.377526 kernel: kauditd_printk_skb: 22 callbacks suppressed
Jan 16 18:00:39.377580 kernel: audit: type=1334 audit(1768586439.376:581): prog-id=172 op=LOAD
Jan 16 18:00:39.376000 audit[4057]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3437 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:00:39.382053 kernel: audit: type=1300 audit(1768586439.376:581): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3437 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:00:39.382111 kernel: audit: type=1327 audit(1768586439.376:581): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534666431356239313435323433363434313433623034616438633831
Jan 16 18:00:39.376000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534666431356239313435323433363434313433623034616438633831
Jan 16 18:00:39.385643 kernel: audit: type=1334 audit(1768586439.376:582): prog-id=173 op=LOAD
Jan 16 18:00:39.376000 audit: BPF prog-id=173 op=LOAD
Jan 16 18:00:39.376000 audit[4057]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3437 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:00:39.389988 kernel: audit: type=1300 audit(1768586439.376:582): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3437 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:00:39.390087 kernel: audit: type=1327 audit(1768586439.376:582): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534666431356239313435323433363434313433623034616438633831
Jan 16 18:00:39.376000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534666431356239313435323433363434313433623034616438633831
Jan 16 18:00:39.377000 audit: BPF prog-id=173 op=UNLOAD
Jan 16 18:00:39.395221 kernel: audit: type=1334 audit(1768586439.377:583): prog-id=173 op=UNLOAD
Jan 16 18:00:39.377000 audit[4057]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:00:39.398606 kernel: audit: type=1300 audit(1768586439.377:583): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc"
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:39.377000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534666431356239313435323433363434313433623034616438633831 Jan 16 18:00:39.402095 kernel: audit: type=1327 audit(1768586439.377:583): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534666431356239313435323433363434313433623034616438633831 Jan 16 18:00:39.402334 kernel: audit: type=1334 audit(1768586439.377:584): prog-id=172 op=UNLOAD Jan 16 18:00:39.377000 audit: BPF prog-id=172 op=UNLOAD Jan 16 18:00:39.377000 audit[4057]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3437 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:39.377000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534666431356239313435323433363434313433623034616438633831 Jan 16 18:00:39.377000 audit: BPF prog-id=174 op=LOAD Jan 16 18:00:39.377000 audit[4057]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3437 pid=4057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:39.377000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3534666431356239313435323433363434313433623034616438633831 Jan 16 18:00:39.424514 containerd[1658]: time="2026-01-16T18:00:39.424396892Z" level=info msg="StartContainer for \"54fd15b9145243644143b04ad8c81d86369309ad7687635351dfe263f7d933aa\" returns successfully" Jan 16 18:00:39.571460 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 16 18:00:39.571558 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 16 18:00:39.791448 kubelet[2921]: I0116 18:00:39.791251 2921 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vnxx\" (UniqueName: \"kubernetes.io/projected/d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3-kube-api-access-5vnxx\") pod \"d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3\" (UID: \"d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3\") " Jan 16 18:00:39.792173 kubelet[2921]: I0116 18:00:39.791730 2921 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3-whisker-ca-bundle\") pod \"d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3\" (UID: \"d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3\") " Jan 16 18:00:39.792388 kubelet[2921]: I0116 18:00:39.792362 2921 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3" (UID: "d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 16 18:00:39.792582 kubelet[2921]: I0116 18:00:39.792556 2921 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3-whisker-backend-key-pair\") pod \"d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3\" (UID: \"d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3\") " Jan 16 18:00:39.792922 kubelet[2921]: I0116 18:00:39.792803 2921 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3-whisker-ca-bundle\") on node \"ci-4580-0-0-p-7f6b5ebc40\" DevicePath \"\"" Jan 16 18:00:39.795192 kubelet[2921]: I0116 18:00:39.795012 2921 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3-kube-api-access-5vnxx" (OuterVolumeSpecName: "kube-api-access-5vnxx") pod "d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3" (UID: "d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3"). InnerVolumeSpecName "kube-api-access-5vnxx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 16 18:00:39.795700 kubelet[2921]: I0116 18:00:39.795517 2921 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3" (UID: "d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 16 18:00:39.893616 kubelet[2921]: I0116 18:00:39.893562 2921 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3-whisker-backend-key-pair\") on node \"ci-4580-0-0-p-7f6b5ebc40\" DevicePath \"\"" Jan 16 18:00:39.893616 kubelet[2921]: I0116 18:00:39.893605 2921 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5vnxx\" (UniqueName: \"kubernetes.io/projected/d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3-kube-api-access-5vnxx\") on node \"ci-4580-0-0-p-7f6b5ebc40\" DevicePath \"\"" Jan 16 18:00:40.030447 systemd[1]: Removed slice kubepods-besteffort-podd1e9d740_842b_4bc4_a9fc_8b41a7b03ee3.slice - libcontainer container kubepods-besteffort-podd1e9d740_842b_4bc4_a9fc_8b41a7b03ee3.slice. Jan 16 18:00:40.043465 kubelet[2921]: I0116 18:00:40.042950 2921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-sx7js" podStartSLOduration=2.06262435 podStartE2EDuration="22.042932526s" podCreationTimestamp="2026-01-16 18:00:18 +0000 UTC" firstStartedPulling="2026-01-16 18:00:19.219929357 +0000 UTC m=+25.707716644" lastFinishedPulling="2026-01-16 18:00:39.200237533 +0000 UTC m=+45.688024820" observedRunningTime="2026-01-16 18:00:39.74385742 +0000 UTC m=+46.231644707" watchObservedRunningTime="2026-01-16 18:00:40.042932526 +0000 UTC m=+46.530719813" Jan 16 18:00:40.076139 systemd[1]: Created slice kubepods-besteffort-pod19ae2064_76f2_47ed_9968_9de33cfc7702.slice - libcontainer container kubepods-besteffort-pod19ae2064_76f2_47ed_9968_9de33cfc7702.slice. 
Jan 16 18:00:40.094632 kubelet[2921]: I0116 18:00:40.094553 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vqqc\" (UniqueName: \"kubernetes.io/projected/19ae2064-76f2-47ed-9968-9de33cfc7702-kube-api-access-6vqqc\") pod \"whisker-55c9fc4598-2nzbk\" (UID: \"19ae2064-76f2-47ed-9968-9de33cfc7702\") " pod="calico-system/whisker-55c9fc4598-2nzbk" Jan 16 18:00:40.094632 kubelet[2921]: I0116 18:00:40.094637 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/19ae2064-76f2-47ed-9968-9de33cfc7702-whisker-backend-key-pair\") pod \"whisker-55c9fc4598-2nzbk\" (UID: \"19ae2064-76f2-47ed-9968-9de33cfc7702\") " pod="calico-system/whisker-55c9fc4598-2nzbk" Jan 16 18:00:40.094786 kubelet[2921]: I0116 18:00:40.094698 2921 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19ae2064-76f2-47ed-9968-9de33cfc7702-whisker-ca-bundle\") pod \"whisker-55c9fc4598-2nzbk\" (UID: \"19ae2064-76f2-47ed-9968-9de33cfc7702\") " pod="calico-system/whisker-55c9fc4598-2nzbk" Jan 16 18:00:40.171755 systemd[1]: var-lib-kubelet-pods-d1e9d740\x2d842b\x2d4bc4\x2da9fc\x2d8b41a7b03ee3-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d5vnxx.mount: Deactivated successfully. Jan 16 18:00:40.171844 systemd[1]: var-lib-kubelet-pods-d1e9d740\x2d842b\x2d4bc4\x2da9fc\x2d8b41a7b03ee3-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Jan 16 18:00:40.379850 containerd[1658]: time="2026-01-16T18:00:40.379590466Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-55c9fc4598-2nzbk,Uid:19ae2064-76f2-47ed-9968-9de33cfc7702,Namespace:calico-system,Attempt:0,}" Jan 16 18:00:40.505732 systemd-networkd[1574]: calib5009174f63: Link UP Jan 16 18:00:40.506722 systemd-networkd[1574]: calib5009174f63: Gained carrier Jan 16 18:00:40.517608 containerd[1658]: 2026-01-16 18:00:40.400 [INFO][4121] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 16 18:00:40.517608 containerd[1658]: 2026-01-16 18:00:40.419 [INFO][4121] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--7f6b5ebc40-k8s-whisker--55c9fc4598--2nzbk-eth0 whisker-55c9fc4598- calico-system 19ae2064-76f2-47ed-9968-9de33cfc7702 885 0 2026-01-16 18:00:40 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:55c9fc4598 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4580-0-0-p-7f6b5ebc40 whisker-55c9fc4598-2nzbk eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calib5009174f63 [] [] }} ContainerID="64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" Namespace="calico-system" Pod="whisker-55c9fc4598-2nzbk" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-whisker--55c9fc4598--2nzbk-" Jan 16 18:00:40.517608 containerd[1658]: 2026-01-16 18:00:40.419 [INFO][4121] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" Namespace="calico-system" Pod="whisker-55c9fc4598-2nzbk" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-whisker--55c9fc4598--2nzbk-eth0" Jan 16 18:00:40.517608 containerd[1658]: 2026-01-16 18:00:40.462 [INFO][4136] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" HandleID="k8s-pod-network.64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-whisker--55c9fc4598--2nzbk-eth0" Jan 16 18:00:40.518066 containerd[1658]: 2026-01-16 18:00:40.462 [INFO][4136] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" HandleID="k8s-pod-network.64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-whisker--55c9fc4598--2nzbk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000500ed0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-7f6b5ebc40", "pod":"whisker-55c9fc4598-2nzbk", "timestamp":"2026-01-16 18:00:40.462612078 +0000 UTC"}, Hostname:"ci-4580-0-0-p-7f6b5ebc40", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 18:00:40.518066 containerd[1658]: 2026-01-16 18:00:40.462 [INFO][4136] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 18:00:40.518066 containerd[1658]: 2026-01-16 18:00:40.462 [INFO][4136] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 18:00:40.518066 containerd[1658]: 2026-01-16 18:00:40.462 [INFO][4136] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-7f6b5ebc40' Jan 16 18:00:40.518066 containerd[1658]: 2026-01-16 18:00:40.472 [INFO][4136] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:40.518066 containerd[1658]: 2026-01-16 18:00:40.478 [INFO][4136] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:40.518066 containerd[1658]: 2026-01-16 18:00:40.481 [INFO][4136] ipam/ipam.go 511: Trying affinity for 192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:40.518066 containerd[1658]: 2026-01-16 18:00:40.483 [INFO][4136] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:40.518066 containerd[1658]: 2026-01-16 18:00:40.485 [INFO][4136] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:40.518290 containerd[1658]: 2026-01-16 18:00:40.485 [INFO][4136] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.128/26 handle="k8s-pod-network.64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:40.518290 containerd[1658]: 2026-01-16 18:00:40.486 [INFO][4136] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c Jan 16 18:00:40.518290 containerd[1658]: 2026-01-16 18:00:40.491 [INFO][4136] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.128/26 handle="k8s-pod-network.64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:40.518290 containerd[1658]: 2026-01-16 18:00:40.497 [INFO][4136] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.85.129/26] block=192.168.85.128/26 handle="k8s-pod-network.64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:40.518290 containerd[1658]: 2026-01-16 18:00:40.497 [INFO][4136] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.129/26] handle="k8s-pod-network.64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:40.518290 containerd[1658]: 2026-01-16 18:00:40.497 [INFO][4136] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 18:00:40.518290 containerd[1658]: 2026-01-16 18:00:40.497 [INFO][4136] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.129/26] IPv6=[] ContainerID="64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" HandleID="k8s-pod-network.64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-whisker--55c9fc4598--2nzbk-eth0" Jan 16 18:00:40.518547 containerd[1658]: 2026-01-16 18:00:40.499 [INFO][4121] cni-plugin/k8s.go 418: Populated endpoint ContainerID="64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" Namespace="calico-system" Pod="whisker-55c9fc4598-2nzbk" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-whisker--55c9fc4598--2nzbk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-whisker--55c9fc4598--2nzbk-eth0", GenerateName:"whisker-55c9fc4598-", Namespace:"calico-system", SelfLink:"", UID:"19ae2064-76f2-47ed-9968-9de33cfc7702", ResourceVersion:"885", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"55c9fc4598", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"", Pod:"whisker-55c9fc4598-2nzbk", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.85.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib5009174f63", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:40.518547 containerd[1658]: 2026-01-16 18:00:40.499 [INFO][4121] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.129/32] ContainerID="64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" Namespace="calico-system" Pod="whisker-55c9fc4598-2nzbk" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-whisker--55c9fc4598--2nzbk-eth0" Jan 16 18:00:40.518638 containerd[1658]: 2026-01-16 18:00:40.499 [INFO][4121] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib5009174f63 ContainerID="64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" Namespace="calico-system" Pod="whisker-55c9fc4598-2nzbk" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-whisker--55c9fc4598--2nzbk-eth0" Jan 16 18:00:40.518638 containerd[1658]: 2026-01-16 18:00:40.506 [INFO][4121] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" Namespace="calico-system" Pod="whisker-55c9fc4598-2nzbk" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-whisker--55c9fc4598--2nzbk-eth0" Jan 16 18:00:40.518694 containerd[1658]: 2026-01-16 18:00:40.507 [INFO][4121] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" Namespace="calico-system" Pod="whisker-55c9fc4598-2nzbk" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-whisker--55c9fc4598--2nzbk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-whisker--55c9fc4598--2nzbk-eth0", GenerateName:"whisker-55c9fc4598-", Namespace:"calico-system", SelfLink:"", UID:"19ae2064-76f2-47ed-9968-9de33cfc7702", ResourceVersion:"885", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"55c9fc4598", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c", Pod:"whisker-55c9fc4598-2nzbk", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.85.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib5009174f63", MAC:"36:31:1a:73:18:0f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:40.518742 containerd[1658]: 2026-01-16 18:00:40.515 [INFO][4121] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" Namespace="calico-system" Pod="whisker-55c9fc4598-2nzbk" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-whisker--55c9fc4598--2nzbk-eth0" Jan 16 18:00:40.545144 containerd[1658]: time="2026-01-16T18:00:40.544875367Z" level=info msg="connecting to shim 64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c" address="unix:///run/containerd/s/a203ca94216e19540351cea980755e393a9ad1ad1966f9350ba4a65806bbb03d" namespace=k8s.io protocol=ttrpc version=3 Jan 16 18:00:40.566642 systemd[1]: Started cri-containerd-64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c.scope - libcontainer container 64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c. Jan 16 18:00:40.577000 audit: BPF prog-id=175 op=LOAD Jan 16 18:00:40.577000 audit: BPF prog-id=176 op=LOAD Jan 16 18:00:40.577000 audit[4172]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4161 pid=4172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:40.577000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634623734666637646334366462613566333561663765313033623131 Jan 16 18:00:40.578000 audit: BPF prog-id=176 op=UNLOAD Jan 16 18:00:40.578000 audit[4172]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4161 pid=4172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:40.578000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634623734666637646334366462613566333561663765313033623131 Jan 16 18:00:40.578000 audit: BPF prog-id=177 op=LOAD Jan 16 18:00:40.578000 audit[4172]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4161 pid=4172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:40.578000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634623734666637646334366462613566333561663765313033623131 Jan 16 18:00:40.578000 audit: BPF prog-id=178 op=LOAD Jan 16 18:00:40.578000 audit[4172]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4161 pid=4172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:40.578000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634623734666637646334366462613566333561663765313033623131 Jan 16 18:00:40.578000 audit: BPF prog-id=178 op=UNLOAD Jan 16 18:00:40.578000 audit[4172]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4161 pid=4172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 16 18:00:40.578000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634623734666637646334366462613566333561663765313033623131 Jan 16 18:00:40.578000 audit: BPF prog-id=177 op=UNLOAD Jan 16 18:00:40.578000 audit[4172]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4161 pid=4172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:40.578000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634623734666637646334366462613566333561663765313033623131 Jan 16 18:00:40.578000 audit: BPF prog-id=179 op=LOAD Jan 16 18:00:40.578000 audit[4172]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4161 pid=4172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:40.578000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634623734666637646334366462613566333561663765313033623131 Jan 16 18:00:40.601116 containerd[1658]: time="2026-01-16T18:00:40.601080657Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-55c9fc4598-2nzbk,Uid:19ae2064-76f2-47ed-9968-9de33cfc7702,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"64b74ff7dc46dba5f35af7e103b117db2164955764f8312d10a55cbbc9879c7c\"" Jan 16 18:00:40.603165 containerd[1658]: time="2026-01-16T18:00:40.602853383Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 16 18:00:40.937080 containerd[1658]: time="2026-01-16T18:00:40.937031595Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:40.939770 containerd[1658]: time="2026-01-16T18:00:40.939725803Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 16 18:00:40.939871 containerd[1658]: time="2026-01-16T18:00:40.939747523Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:40.940498 kubelet[2921]: E0116 18:00:40.939975 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 18:00:40.940498 kubelet[2921]: E0116 18:00:40.940026 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 18:00:40.940817 kubelet[2921]: E0116 18:00:40.940232 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0b1f42d8785e424d85c2c2149fa8b5bd,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6vqqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-55c9fc4598-2nzbk_calico-system(19ae2064-76f2-47ed-9968-9de33cfc7702): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:40.942193 containerd[1658]: time="2026-01-16T18:00:40.942141731Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 16 18:00:41.019000 audit: BPF prog-id=180 op=LOAD Jan 16 
18:00:41.019000 audit[4351]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffeb818668 a2=98 a3=ffffeb818658 items=0 ppid=4252 pid=4351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.019000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 18:00:41.019000 audit: BPF prog-id=180 op=UNLOAD Jan 16 18:00:41.019000 audit[4351]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffeb818638 a3=0 items=0 ppid=4252 pid=4351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.019000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 18:00:41.019000 audit: BPF prog-id=181 op=LOAD Jan 16 18:00:41.019000 audit[4351]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffeb818518 a2=74 a3=95 items=0 ppid=4252 pid=4351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.019000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 18:00:41.019000 audit: BPF prog-id=181 op=UNLOAD Jan 16 18:00:41.019000 audit[4351]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4252 pid=4351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.019000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 18:00:41.019000 audit: BPF prog-id=182 op=LOAD Jan 16 18:00:41.019000 audit[4351]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffeb818548 a2=40 a3=ffffeb818578 items=0 ppid=4252 pid=4351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.019000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 18:00:41.019000 audit: BPF prog-id=182 op=UNLOAD Jan 16 18:00:41.019000 audit[4351]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffeb818578 items=0 ppid=4252 pid=4351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.019000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 18:00:41.020000 audit: BPF prog-id=183 op=LOAD Jan 16 18:00:41.020000 audit[4352]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff901d3e8 a2=98 a3=fffff901d3d8 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.020000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.020000 audit: BPF prog-id=183 op=UNLOAD Jan 16 18:00:41.020000 audit[4352]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffff901d3b8 a3=0 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.020000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.020000 audit: BPF prog-id=184 op=LOAD Jan 16 18:00:41.020000 audit[4352]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffff901d078 a2=74 a3=95 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.020000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.020000 audit: BPF prog-id=184 op=UNLOAD Jan 16 18:00:41.020000 audit[4352]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4252 
pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.020000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.021000 audit: BPF prog-id=185 op=LOAD Jan 16 18:00:41.021000 audit[4352]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffff901d0d8 a2=94 a3=2 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.021000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.021000 audit: BPF prog-id=185 op=UNLOAD Jan 16 18:00:41.021000 audit[4352]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.021000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.120000 audit: BPF prog-id=186 op=LOAD Jan 16 18:00:41.120000 audit[4352]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffff901d098 a2=40 a3=fffff901d0c8 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.120000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.120000 audit: BPF prog-id=186 op=UNLOAD Jan 16 18:00:41.120000 audit[4352]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=fffff901d0c8 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.120000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.129000 audit: BPF prog-id=187 op=LOAD Jan 16 18:00:41.129000 audit[4352]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffff901d0a8 a2=94 a3=4 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.129000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.129000 audit: BPF prog-id=187 op=UNLOAD Jan 16 18:00:41.129000 audit[4352]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.129000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.130000 audit: BPF prog-id=188 op=LOAD Jan 16 18:00:41.130000 audit[4352]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff901cee8 a2=94 a3=5 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.130000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.130000 audit: BPF prog-id=188 op=UNLOAD Jan 16 18:00:41.130000 audit[4352]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.130000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.130000 audit: BPF prog-id=189 op=LOAD Jan 16 18:00:41.130000 audit[4352]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffff901d118 a2=94 a3=6 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.130000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.130000 audit: BPF prog-id=189 op=UNLOAD Jan 16 18:00:41.130000 audit[4352]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.130000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.130000 audit: BPF prog-id=190 op=LOAD Jan 16 18:00:41.130000 audit[4352]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffff901c8e8 a2=94 a3=83 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.130000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.130000 audit: BPF prog-id=191 op=LOAD Jan 16 18:00:41.130000 audit[4352]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=fffff901c6a8 a2=94 a3=2 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.130000 audit: 
PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.130000 audit: BPF prog-id=191 op=UNLOAD Jan 16 18:00:41.130000 audit[4352]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.130000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.131000 audit: BPF prog-id=190 op=UNLOAD Jan 16 18:00:41.131000 audit[4352]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=ee63620 a3=ee56b00 items=0 ppid=4252 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.131000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 18:00:41.139000 audit: BPF prog-id=192 op=LOAD Jan 16 18:00:41.139000 audit[4355]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdd8708f8 a2=98 a3=ffffdd8708e8 items=0 ppid=4252 pid=4355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.139000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 18:00:41.139000 audit: BPF prog-id=192 op=UNLOAD Jan 16 18:00:41.139000 audit[4355]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffdd8708c8 a3=0 items=0 ppid=4252 pid=4355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.139000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 18:00:41.139000 audit: BPF prog-id=193 op=LOAD Jan 16 18:00:41.139000 audit[4355]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdd8707a8 a2=74 a3=95 items=0 ppid=4252 pid=4355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.139000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 18:00:41.139000 audit: BPF prog-id=193 op=UNLOAD Jan 16 18:00:41.139000 audit[4355]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4252 pid=4355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.139000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 18:00:41.139000 audit: BPF prog-id=194 op=LOAD Jan 16 18:00:41.139000 audit[4355]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdd8707d8 a2=40 a3=ffffdd870808 
items=0 ppid=4252 pid=4355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.139000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 18:00:41.139000 audit: BPF prog-id=194 op=UNLOAD Jan 16 18:00:41.139000 audit[4355]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffdd870808 items=0 ppid=4252 pid=4355 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.139000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 18:00:41.197401 systemd-networkd[1574]: vxlan.calico: Link UP Jan 16 18:00:41.197407 systemd-networkd[1574]: vxlan.calico: Gained carrier Jan 16 18:00:41.228000 audit: BPF prog-id=195 op=LOAD Jan 16 18:00:41.228000 audit[4379]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe9f2a848 a2=98 a3=ffffe9f2a838 items=0 ppid=4252 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.228000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 18:00:41.228000 audit: BPF prog-id=195 op=UNLOAD Jan 16 18:00:41.228000 audit[4379]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffe9f2a818 a3=0 items=0 ppid=4252 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.228000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 18:00:41.228000 audit: BPF prog-id=196 op=LOAD Jan 16 18:00:41.228000 audit[4379]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe9f2a528 a2=74 a3=95 items=0 ppid=4252 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.228000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 18:00:41.228000 audit: BPF prog-id=196 op=UNLOAD Jan 16 18:00:41.228000 audit[4379]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4252 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.228000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 18:00:41.228000 audit: BPF prog-id=197 op=LOAD Jan 16 18:00:41.228000 audit[4379]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe9f2a588 a2=94 a3=2 items=0 ppid=4252 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.228000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 18:00:41.228000 audit: BPF prog-id=197 op=UNLOAD Jan 16 18:00:41.228000 audit[4379]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4252 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.228000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 18:00:41.228000 audit: BPF prog-id=198 op=LOAD Jan 16 18:00:41.228000 audit[4379]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe9f2a408 a2=40 a3=ffffe9f2a438 items=0 ppid=4252 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.228000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 18:00:41.228000 audit: BPF prog-id=198 op=UNLOAD Jan 16 18:00:41.228000 audit[4379]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=ffffe9f2a438 items=0 ppid=4252 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.228000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 18:00:41.228000 audit: BPF prog-id=199 op=LOAD Jan 16 18:00:41.228000 audit[4379]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe9f2a558 a2=94 a3=b7 items=0 ppid=4252 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.228000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 18:00:41.228000 audit: BPF prog-id=199 op=UNLOAD Jan 16 18:00:41.228000 audit[4379]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4252 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.228000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 18:00:41.229000 audit: BPF prog-id=200 op=LOAD Jan 16 18:00:41.229000 audit[4379]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe9f29c08 a2=94 a3=2 items=0 ppid=4252 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.229000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 18:00:41.229000 audit: BPF prog-id=200 op=UNLOAD Jan 16 18:00:41.229000 audit[4379]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4252 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.229000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 18:00:41.229000 audit: BPF prog-id=201 op=LOAD Jan 16 18:00:41.229000 audit[4379]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe9f29d98 a2=94 a3=30 items=0 ppid=4252 pid=4379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.229000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 18:00:41.231000 audit: BPF prog-id=202 op=LOAD Jan 16 18:00:41.231000 audit[4383]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc0a71b28 a2=98 a3=ffffc0a71b18 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.231000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.231000 audit: BPF prog-id=202 op=UNLOAD Jan 16 18:00:41.231000 audit[4383]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffc0a71af8 a3=0 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.231000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.232000 audit: BPF prog-id=203 op=LOAD Jan 16 18:00:41.232000 audit[4383]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc0a717b8 a2=74 a3=95 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.232000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.232000 audit: BPF prog-id=203 op=UNLOAD Jan 16 18:00:41.232000 audit[4383]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.232000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.232000 audit: BPF prog-id=204 op=LOAD Jan 16 18:00:41.232000 audit[4383]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc0a71818 a2=94 a3=2 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.232000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.232000 audit: BPF prog-id=204 op=UNLOAD Jan 16 18:00:41.232000 audit[4383]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.232000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.289019 containerd[1658]: time="2026-01-16T18:00:41.288958821Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:41.295397 containerd[1658]: time="2026-01-16T18:00:41.295354041Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 16 18:00:41.295502 containerd[1658]: time="2026-01-16T18:00:41.295443401Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:41.295653 kubelet[2921]: E0116 18:00:41.295597 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 18:00:41.295653 kubelet[2921]: E0116 18:00:41.295642 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 18:00:41.295870 kubelet[2921]: E0116 18:00:41.295807 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6vqqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-55c9fc4598-2nzbk_calico-system(19ae2064-76f2-47ed-9968-9de33cfc7702): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:41.297163 kubelet[2921]: E0116 18:00:41.297105 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:00:41.335000 audit: BPF prog-id=205 op=LOAD Jan 16 18:00:41.335000 audit[4383]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc0a717d8 a2=40 a3=ffffc0a71808 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.335000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.335000 audit: BPF prog-id=205 op=UNLOAD Jan 16 18:00:41.335000 audit[4383]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffc0a71808 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.335000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.345000 audit: BPF prog-id=206 op=LOAD Jan 16 18:00:41.345000 audit[4383]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc0a717e8 a2=94 a3=4 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.345000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.345000 audit: BPF prog-id=206 op=UNLOAD Jan 16 18:00:41.345000 audit[4383]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.345000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.345000 audit: BPF prog-id=207 op=LOAD Jan 16 18:00:41.345000 audit[4383]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffc0a71628 a2=94 a3=5 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.345000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.345000 audit: BPF prog-id=207 op=UNLOAD Jan 16 18:00:41.345000 audit[4383]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.345000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.345000 audit: BPF prog-id=208 op=LOAD Jan 16 18:00:41.345000 audit[4383]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc0a71858 a2=94 a3=6 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.345000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.345000 audit: BPF prog-id=208 op=UNLOAD Jan 16 18:00:41.345000 audit[4383]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.345000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.345000 audit: BPF prog-id=209 op=LOAD Jan 16 18:00:41.345000 audit[4383]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc0a71028 a2=94 a3=83 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.345000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.346000 audit: BPF prog-id=210 op=LOAD Jan 16 18:00:41.346000 audit[4383]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffc0a70de8 a2=94 a3=2 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.346000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.346000 audit: BPF prog-id=210 op=UNLOAD Jan 16 18:00:41.346000 audit[4383]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.346000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.346000 audit: BPF prog-id=209 op=UNLOAD Jan 16 18:00:41.346000 audit[4383]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=16b7620 a3=16aab00 items=0 ppid=4252 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.346000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 18:00:41.359000 audit: BPF prog-id=201 op=UNLOAD Jan 16 18:00:41.359000 audit[4252]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=4000c13200 a2=0 a3=0 items=0 ppid=4224 pid=4252 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.359000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 16 18:00:41.411000 audit[4412]: NETFILTER_CFG table=mangle:121 family=2 entries=16 op=nft_register_chain pid=4412 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 18:00:41.411000 audit[4412]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=fffff6aef150 a2=0 a3=ffffbc098fa8 items=0 ppid=4252 pid=4412 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.411000 audit[4411]: NETFILTER_CFG table=nat:122 family=2 entries=15 op=nft_register_chain pid=4411 subj=system_u:system_r:kernel_t:s0 
comm="iptables-nft-re" Jan 16 18:00:41.411000 audit[4411]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=ffffe986d1a0 a2=0 a3=ffffa2b14fa8 items=0 ppid=4252 pid=4411 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.411000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 18:00:41.411000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 18:00:41.419000 audit[4410]: NETFILTER_CFG table=raw:123 family=2 entries=21 op=nft_register_chain pid=4410 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 18:00:41.419000 audit[4410]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=ffffe8f77580 a2=0 a3=ffff904fffa8 items=0 ppid=4252 pid=4410 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.419000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 18:00:41.427000 audit[4415]: NETFILTER_CFG table=filter:124 family=2 entries=94 op=nft_register_chain pid=4415 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 18:00:41.427000 audit[4415]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=53116 a0=3 a1=fffff890be70 a2=0 a3=ffffbaadbfa8 items=0 ppid=4252 pid=4415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.427000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 18:00:41.603789 kubelet[2921]: I0116 18:00:41.603750 2921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3" path="/var/lib/kubelet/pods/d1e9d740-842b-4bc4-a9fc-8b41a7b03ee3/volumes" Jan 16 18:00:41.730570 kubelet[2921]: E0116 18:00:41.730339 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:00:41.755000 audit[4443]: NETFILTER_CFG table=filter:125 family=2 entries=20 op=nft_register_rule pid=4443 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:41.755000 audit[4443]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffcbf16d60 a2=0 a3=1 items=0 ppid=3072 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.755000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:41.762000 audit[4443]: NETFILTER_CFG table=nat:126 family=2 entries=14 op=nft_register_rule pid=4443 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:41.762000 audit[4443]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffcbf16d60 a2=0 a3=1 items=0 ppid=3072 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:41.762000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:41.905631 systemd-networkd[1574]: calib5009174f63: Gained IPv6LL Jan 16 18:00:42.993707 systemd-networkd[1574]: vxlan.calico: Gained IPv6LL Jan 16 18:00:47.602559 containerd[1658]: time="2026-01-16T18:00:47.602004149Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-5zv4w,Uid:be139626-cf50-4387-b6b7-3299ee34ffb2,Namespace:kube-system,Attempt:0,}" Jan 16 18:00:47.602559 containerd[1658]: time="2026-01-16T18:00:47.602450830Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cdd49cc55-ck2gc,Uid:2d37882f-c058-4d23-87c0-40e1fbaf0de7,Namespace:calico-apiserver,Attempt:0,}" Jan 16 18:00:47.727163 systemd-networkd[1574]: calib0289440d4e: Link UP Jan 16 18:00:47.727315 systemd-networkd[1574]: calib0289440d4e: Gained carrier Jan 16 18:00:47.740890 containerd[1658]: 2026-01-16 18:00:47.657 [INFO][4474] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ck2gc-eth0 calico-apiserver-5cdd49cc55- calico-apiserver 2d37882f-c058-4d23-87c0-40e1fbaf0de7 822 0 2026-01-16 18:00:11 +0000 UTC map[apiserver:true 
app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5cdd49cc55 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4580-0-0-p-7f6b5ebc40 calico-apiserver-5cdd49cc55-ck2gc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib0289440d4e [] [] }} ContainerID="e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ck2gc" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ck2gc-" Jan 16 18:00:47.740890 containerd[1658]: 2026-01-16 18:00:47.657 [INFO][4474] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ck2gc" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ck2gc-eth0" Jan 16 18:00:47.740890 containerd[1658]: 2026-01-16 18:00:47.680 [INFO][4491] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" HandleID="k8s-pod-network.e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ck2gc-eth0" Jan 16 18:00:47.741199 containerd[1658]: 2026-01-16 18:00:47.681 [INFO][4491] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" HandleID="k8s-pod-network.e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ck2gc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400050eb30), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4580-0-0-p-7f6b5ebc40", 
"pod":"calico-apiserver-5cdd49cc55-ck2gc", "timestamp":"2026-01-16 18:00:47.680913188 +0000 UTC"}, Hostname:"ci-4580-0-0-p-7f6b5ebc40", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 18:00:47.741199 containerd[1658]: 2026-01-16 18:00:47.681 [INFO][4491] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 18:00:47.741199 containerd[1658]: 2026-01-16 18:00:47.681 [INFO][4491] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 16 18:00:47.741199 containerd[1658]: 2026-01-16 18:00:47.681 [INFO][4491] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-7f6b5ebc40' Jan 16 18:00:47.741199 containerd[1658]: 2026-01-16 18:00:47.690 [INFO][4491] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.741199 containerd[1658]: 2026-01-16 18:00:47.694 [INFO][4491] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.741199 containerd[1658]: 2026-01-16 18:00:47.699 [INFO][4491] ipam/ipam.go 511: Trying affinity for 192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.741199 containerd[1658]: 2026-01-16 18:00:47.701 [INFO][4491] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.741199 containerd[1658]: 2026-01-16 18:00:47.704 [INFO][4491] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.741463 containerd[1658]: 2026-01-16 18:00:47.704 [INFO][4491] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.128/26 handle="k8s-pod-network.e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" 
host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.741463 containerd[1658]: 2026-01-16 18:00:47.705 [INFO][4491] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6 Jan 16 18:00:47.741463 containerd[1658]: 2026-01-16 18:00:47.709 [INFO][4491] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.128/26 handle="k8s-pod-network.e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.741463 containerd[1658]: 2026-01-16 18:00:47.715 [INFO][4491] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.85.130/26] block=192.168.85.128/26 handle="k8s-pod-network.e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.741463 containerd[1658]: 2026-01-16 18:00:47.715 [INFO][4491] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.130/26] handle="k8s-pod-network.e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.741463 containerd[1658]: 2026-01-16 18:00:47.715 [INFO][4491] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 16 18:00:47.741463 containerd[1658]: 2026-01-16 18:00:47.715 [INFO][4491] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.130/26] IPv6=[] ContainerID="e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" HandleID="k8s-pod-network.e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ck2gc-eth0" Jan 16 18:00:47.741617 containerd[1658]: 2026-01-16 18:00:47.722 [INFO][4474] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ck2gc" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ck2gc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ck2gc-eth0", GenerateName:"calico-apiserver-5cdd49cc55-", Namespace:"calico-apiserver", SelfLink:"", UID:"2d37882f-c058-4d23-87c0-40e1fbaf0de7", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5cdd49cc55", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"", Pod:"calico-apiserver-5cdd49cc55-ck2gc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.85.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib0289440d4e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:47.741667 containerd[1658]: 2026-01-16 18:00:47.722 [INFO][4474] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.130/32] ContainerID="e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ck2gc" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ck2gc-eth0" Jan 16 18:00:47.741667 containerd[1658]: 2026-01-16 18:00:47.722 [INFO][4474] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib0289440d4e ContainerID="e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ck2gc" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ck2gc-eth0" Jan 16 18:00:47.741667 containerd[1658]: 2026-01-16 18:00:47.726 [INFO][4474] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ck2gc" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ck2gc-eth0" Jan 16 18:00:47.741727 containerd[1658]: 2026-01-16 18:00:47.727 [INFO][4474] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ck2gc" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ck2gc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ck2gc-eth0", GenerateName:"calico-apiserver-5cdd49cc55-", Namespace:"calico-apiserver", SelfLink:"", UID:"2d37882f-c058-4d23-87c0-40e1fbaf0de7", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5cdd49cc55", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6", Pod:"calico-apiserver-5cdd49cc55-ck2gc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.85.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib0289440d4e", MAC:"16:02:06:da:b3:de", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:47.741772 containerd[1658]: 2026-01-16 18:00:47.737 [INFO][4474] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ck2gc" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ck2gc-eth0" Jan 16 18:00:47.752000 audit[4518]: NETFILTER_CFG table=filter:127 family=2 entries=50 
op=nft_register_chain pid=4518 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 18:00:47.755467 kernel: kauditd_printk_skb: 231 callbacks suppressed Jan 16 18:00:47.755520 kernel: audit: type=1325 audit(1768586447.752:662): table=filter:127 family=2 entries=50 op=nft_register_chain pid=4518 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 18:00:47.752000 audit[4518]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=28208 a0=3 a1=ffffe5da2e30 a2=0 a3=ffff9e98efa8 items=0 ppid=4252 pid=4518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.759684 kernel: audit: type=1300 audit(1768586447.752:662): arch=c00000b7 syscall=211 success=yes exit=28208 a0=3 a1=ffffe5da2e30 a2=0 a3=ffff9e98efa8 items=0 ppid=4252 pid=4518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.752000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 18:00:47.761953 kernel: audit: type=1327 audit(1768586447.752:662): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 18:00:47.769009 containerd[1658]: time="2026-01-16T18:00:47.768964655Z" level=info msg="connecting to shim e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6" address="unix:///run/containerd/s/c820bef79fa6fdd43f7d85b820cc18883c53ecb8c452c81f2b44f626200463f8" namespace=k8s.io protocol=ttrpc version=3 Jan 16 18:00:47.796635 systemd[1]: Started cri-containerd-e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6.scope 
- libcontainer container e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6. Jan 16 18:00:47.809000 audit: BPF prog-id=211 op=LOAD Jan 16 18:00:47.809000 audit: BPF prog-id=212 op=LOAD Jan 16 18:00:47.809000 audit[4539]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4528 pid=4539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.814673 kernel: audit: type=1334 audit(1768586447.809:663): prog-id=211 op=LOAD Jan 16 18:00:47.814776 kernel: audit: type=1334 audit(1768586447.809:664): prog-id=212 op=LOAD Jan 16 18:00:47.814832 kernel: audit: type=1300 audit(1768586447.809:664): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4528 pid=4539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.814878 kernel: audit: type=1327 audit(1768586447.809:664): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533366134666162623763363532623164343765366630613832633266 Jan 16 18:00:47.809000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533366134666162623763363532623164343765366630613832633266 Jan 16 18:00:47.810000 audit: BPF prog-id=212 op=UNLOAD Jan 16 18:00:47.819118 kernel: audit: type=1334 audit(1768586447.810:665): prog-id=212 op=UNLOAD Jan 16 18:00:47.819455 kernel: audit: type=1300 audit(1768586447.810:665): arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 
a2=0 a3=0 items=0 ppid=4528 pid=4539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.810000 audit[4539]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4528 pid=4539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.810000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533366134666162623763363532623164343765366630613832633266 Jan 16 18:00:47.825874 kernel: audit: type=1327 audit(1768586447.810:665): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533366134666162623763363532623164343765366630613832633266 Jan 16 18:00:47.810000 audit: BPF prog-id=213 op=LOAD Jan 16 18:00:47.810000 audit[4539]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4528 pid=4539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.810000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533366134666162623763363532623164343765366630613832633266 Jan 16 18:00:47.811000 audit: BPF prog-id=214 op=LOAD Jan 16 18:00:47.811000 audit[4539]: SYSCALL arch=c00000b7 syscall=280 success=yes 
exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4528 pid=4539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.811000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533366134666162623763363532623164343765366630613832633266 Jan 16 18:00:47.817000 audit: BPF prog-id=214 op=UNLOAD Jan 16 18:00:47.817000 audit[4539]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4528 pid=4539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533366134666162623763363532623164343765366630613832633266 Jan 16 18:00:47.817000 audit: BPF prog-id=213 op=UNLOAD Jan 16 18:00:47.817000 audit[4539]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4528 pid=4539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533366134666162623763363532623164343765366630613832633266 Jan 16 18:00:47.817000 audit: BPF prog-id=215 op=LOAD Jan 16 18:00:47.817000 audit[4539]: SYSCALL 
arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4528 pid=4539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533366134666162623763363532623164343765366630613832633266 Jan 16 18:00:47.833093 systemd-networkd[1574]: califbc2e79502c: Link UP Jan 16 18:00:47.834104 systemd-networkd[1574]: califbc2e79502c: Gained carrier Jan 16 18:00:47.851200 containerd[1658]: 2026-01-16 18:00:47.657 [INFO][4464] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--5zv4w-eth0 coredns-668d6bf9bc- kube-system be139626-cf50-4387-b6b7-3299ee34ffb2 826 0 2026-01-16 18:00:00 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4580-0-0-p-7f6b5ebc40 coredns-668d6bf9bc-5zv4w eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] califbc2e79502c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" Namespace="kube-system" Pod="coredns-668d6bf9bc-5zv4w" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--5zv4w-" Jan 16 18:00:47.851200 containerd[1658]: 2026-01-16 18:00:47.658 [INFO][4464] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" Namespace="kube-system" Pod="coredns-668d6bf9bc-5zv4w" 
WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--5zv4w-eth0" Jan 16 18:00:47.851200 containerd[1658]: 2026-01-16 18:00:47.681 [INFO][4492] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" HandleID="k8s-pod-network.d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--5zv4w-eth0" Jan 16 18:00:47.851399 containerd[1658]: 2026-01-16 18:00:47.681 [INFO][4492] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" HandleID="k8s-pod-network.d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--5zv4w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000514ac0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4580-0-0-p-7f6b5ebc40", "pod":"coredns-668d6bf9bc-5zv4w", "timestamp":"2026-01-16 18:00:47.680996068 +0000 UTC"}, Hostname:"ci-4580-0-0-p-7f6b5ebc40", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 18:00:47.851399 containerd[1658]: 2026-01-16 18:00:47.681 [INFO][4492] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 18:00:47.851399 containerd[1658]: 2026-01-16 18:00:47.715 [INFO][4492] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 18:00:47.851399 containerd[1658]: 2026-01-16 18:00:47.715 [INFO][4492] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-7f6b5ebc40' Jan 16 18:00:47.851399 containerd[1658]: 2026-01-16 18:00:47.791 [INFO][4492] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.851399 containerd[1658]: 2026-01-16 18:00:47.795 [INFO][4492] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.851399 containerd[1658]: 2026-01-16 18:00:47.801 [INFO][4492] ipam/ipam.go 511: Trying affinity for 192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.851399 containerd[1658]: 2026-01-16 18:00:47.803 [INFO][4492] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.851399 containerd[1658]: 2026-01-16 18:00:47.805 [INFO][4492] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.851669 containerd[1658]: 2026-01-16 18:00:47.805 [INFO][4492] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.128/26 handle="k8s-pod-network.d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.851669 containerd[1658]: 2026-01-16 18:00:47.807 [INFO][4492] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d Jan 16 18:00:47.851669 containerd[1658]: 2026-01-16 18:00:47.818 [INFO][4492] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.128/26 handle="k8s-pod-network.d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.851669 containerd[1658]: 2026-01-16 18:00:47.827 [INFO][4492] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.85.131/26] block=192.168.85.128/26 handle="k8s-pod-network.d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.851669 containerd[1658]: 2026-01-16 18:00:47.827 [INFO][4492] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.131/26] handle="k8s-pod-network.d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:47.851669 containerd[1658]: 2026-01-16 18:00:47.827 [INFO][4492] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 18:00:47.851669 containerd[1658]: 2026-01-16 18:00:47.827 [INFO][4492] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.131/26] IPv6=[] ContainerID="d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" HandleID="k8s-pod-network.d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--5zv4w-eth0" Jan 16 18:00:47.851924 containerd[1658]: 2026-01-16 18:00:47.831 [INFO][4464] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" Namespace="kube-system" Pod="coredns-668d6bf9bc-5zv4w" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--5zv4w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--5zv4w-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"be139626-cf50-4387-b6b7-3299ee34ffb2", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"", Pod:"coredns-668d6bf9bc-5zv4w", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.85.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califbc2e79502c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:47.851924 containerd[1658]: 2026-01-16 18:00:47.831 [INFO][4464] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.131/32] ContainerID="d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" Namespace="kube-system" Pod="coredns-668d6bf9bc-5zv4w" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--5zv4w-eth0" Jan 16 18:00:47.851924 containerd[1658]: 2026-01-16 18:00:47.831 [INFO][4464] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califbc2e79502c ContainerID="d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" Namespace="kube-system" Pod="coredns-668d6bf9bc-5zv4w" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--5zv4w-eth0" Jan 16 18:00:47.851924 containerd[1658]: 2026-01-16 18:00:47.833 [INFO][4464] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" Namespace="kube-system" Pod="coredns-668d6bf9bc-5zv4w" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--5zv4w-eth0" Jan 16 18:00:47.851924 containerd[1658]: 2026-01-16 18:00:47.835 [INFO][4464] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" Namespace="kube-system" Pod="coredns-668d6bf9bc-5zv4w" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--5zv4w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--5zv4w-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"be139626-cf50-4387-b6b7-3299ee34ffb2", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d", Pod:"coredns-668d6bf9bc-5zv4w", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.85.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califbc2e79502c", 
MAC:"76:0e:e3:20:af:65", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:47.851924 containerd[1658]: 2026-01-16 18:00:47.847 [INFO][4464] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" Namespace="kube-system" Pod="coredns-668d6bf9bc-5zv4w" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--5zv4w-eth0" Jan 16 18:00:47.860119 containerd[1658]: time="2026-01-16T18:00:47.859358969Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cdd49cc55-ck2gc,Uid:2d37882f-c058-4d23-87c0-40e1fbaf0de7,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e36a4fabb7c652b1d47e6f0a82c2f082aade0a79cc9036735ec24ba759d621f6\"" Jan 16 18:00:47.861470 containerd[1658]: time="2026-01-16T18:00:47.861344815Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 18:00:47.865000 audit[4575]: NETFILTER_CFG table=filter:128 family=2 entries=52 op=nft_register_chain pid=4575 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 18:00:47.865000 audit[4575]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=26592 a0=3 a1=ffffd71ba840 a2=0 a3=ffffb9b07fa8 items=0 ppid=4252 pid=4575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.865000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 18:00:47.883690 containerd[1658]: time="2026-01-16T18:00:47.883640122Z" level=info msg="connecting to shim d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d" address="unix:///run/containerd/s/0084b6d5fa810503cb8dd1f071b1f20970fd73828e466572a0ed6383e2421e45" namespace=k8s.io protocol=ttrpc version=3 Jan 16 18:00:47.912810 systemd[1]: Started cri-containerd-d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d.scope - libcontainer container d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d. Jan 16 18:00:47.922000 audit: BPF prog-id=216 op=LOAD Jan 16 18:00:47.922000 audit: BPF prog-id=217 op=LOAD Jan 16 18:00:47.922000 audit[4595]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4584 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.922000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434326666623933323435336231633462613732363734333437313838 Jan 16 18:00:47.922000 audit: BPF prog-id=217 op=UNLOAD Jan 16 18:00:47.922000 audit[4595]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4584 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.922000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434326666623933323435336231633462613732363734333437313838 Jan 16 18:00:47.922000 audit: BPF prog-id=218 op=LOAD Jan 16 18:00:47.922000 audit[4595]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4584 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.922000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434326666623933323435336231633462613732363734333437313838 Jan 16 18:00:47.922000 audit: BPF prog-id=219 op=LOAD Jan 16 18:00:47.922000 audit[4595]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4584 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.922000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434326666623933323435336231633462613732363734333437313838 Jan 16 18:00:47.922000 audit: BPF prog-id=219 op=UNLOAD Jan 16 18:00:47.922000 audit[4595]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4584 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 16 18:00:47.922000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434326666623933323435336231633462613732363734333437313838 Jan 16 18:00:47.922000 audit: BPF prog-id=218 op=UNLOAD Jan 16 18:00:47.922000 audit[4595]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4584 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.922000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434326666623933323435336231633462613732363734333437313838 Jan 16 18:00:47.922000 audit: BPF prog-id=220 op=LOAD Jan 16 18:00:47.922000 audit[4595]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4584 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.922000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6434326666623933323435336231633462613732363734333437313838 Jan 16 18:00:47.947533 containerd[1658]: time="2026-01-16T18:00:47.947487516Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-5zv4w,Uid:be139626-cf50-4387-b6b7-3299ee34ffb2,Namespace:kube-system,Attempt:0,} returns sandbox id 
\"d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d\"" Jan 16 18:00:47.950832 containerd[1658]: time="2026-01-16T18:00:47.950800166Z" level=info msg="CreateContainer within sandbox \"d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 16 18:00:47.960576 containerd[1658]: time="2026-01-16T18:00:47.960523435Z" level=info msg="Container b840b2eb08c66cb920f93e44e4d4c945ac34e271f7d845874dd6b4c50caf4809: CDI devices from CRI Config.CDIDevices: []" Jan 16 18:00:47.967130 containerd[1658]: time="2026-01-16T18:00:47.967082615Z" level=info msg="CreateContainer within sandbox \"d42ffb932453b1c4ba726743471881822c8767b2fa22eac0e1c4371041a0925d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b840b2eb08c66cb920f93e44e4d4c945ac34e271f7d845874dd6b4c50caf4809\"" Jan 16 18:00:47.967737 containerd[1658]: time="2026-01-16T18:00:47.967694977Z" level=info msg="StartContainer for \"b840b2eb08c66cb920f93e44e4d4c945ac34e271f7d845874dd6b4c50caf4809\"" Jan 16 18:00:47.969072 containerd[1658]: time="2026-01-16T18:00:47.968731780Z" level=info msg="connecting to shim b840b2eb08c66cb920f93e44e4d4c945ac34e271f7d845874dd6b4c50caf4809" address="unix:///run/containerd/s/0084b6d5fa810503cb8dd1f071b1f20970fd73828e466572a0ed6383e2421e45" protocol=ttrpc version=3 Jan 16 18:00:47.988660 systemd[1]: Started cri-containerd-b840b2eb08c66cb920f93e44e4d4c945ac34e271f7d845874dd6b4c50caf4809.scope - libcontainer container b840b2eb08c66cb920f93e44e4d4c945ac34e271f7d845874dd6b4c50caf4809. 
Jan 16 18:00:47.998000 audit: BPF prog-id=221 op=LOAD Jan 16 18:00:47.999000 audit: BPF prog-id=222 op=LOAD Jan 16 18:00:47.999000 audit[4620]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4584 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.999000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6238343062326562303863363663623932306639336534346534643463 Jan 16 18:00:47.999000 audit: BPF prog-id=222 op=UNLOAD Jan 16 18:00:47.999000 audit[4620]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4584 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.999000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6238343062326562303863363663623932306639336534346534643463 Jan 16 18:00:47.999000 audit: BPF prog-id=223 op=LOAD Jan 16 18:00:47.999000 audit[4620]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4584 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.999000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6238343062326562303863363663623932306639336534346534643463 Jan 16 18:00:47.999000 audit: BPF prog-id=224 op=LOAD Jan 16 18:00:47.999000 audit[4620]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4584 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.999000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6238343062326562303863363663623932306639336534346534643463 Jan 16 18:00:47.999000 audit: BPF prog-id=224 op=UNLOAD Jan 16 18:00:47.999000 audit[4620]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4584 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.999000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6238343062326562303863363663623932306639336534346534643463 Jan 16 18:00:47.999000 audit: BPF prog-id=223 op=UNLOAD Jan 16 18:00:47.999000 audit[4620]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4584 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
16 18:00:47.999000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6238343062326562303863363663623932306639336534346534643463 Jan 16 18:00:47.999000 audit: BPF prog-id=225 op=LOAD Jan 16 18:00:47.999000 audit[4620]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4584 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:47.999000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6238343062326562303863363663623932306639336534346534643463 Jan 16 18:00:48.016522 containerd[1658]: time="2026-01-16T18:00:48.016489005Z" level=info msg="StartContainer for \"b840b2eb08c66cb920f93e44e4d4c945ac34e271f7d845874dd6b4c50caf4809\" returns successfully" Jan 16 18:00:48.195150 containerd[1658]: time="2026-01-16T18:00:48.195098466Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:48.196322 containerd[1658]: time="2026-01-16T18:00:48.196287429Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 18:00:48.196415 containerd[1658]: time="2026-01-16T18:00:48.196308229Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:48.196609 kubelet[2921]: E0116 18:00:48.196558 2921 log.go:32] "PullImage from image service 
failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:00:48.197128 kubelet[2921]: E0116 18:00:48.196709 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:00:48.197354 kubelet[2921]: E0116 18:00:48.197267 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdp42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5cdd49cc55-ck2gc_calico-apiserver(2d37882f-c058-4d23-87c0-40e1fbaf0de7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:48.198517 kubelet[2921]: E0116 18:00:48.198472 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:00:48.602843 containerd[1658]: time="2026-01-16T18:00:48.602451700Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-t28mh,Uid:2f2a1d19-8f6a-44c8-adb8-cc33e1ecebce,Namespace:kube-system,Attempt:0,}" Jan 16 18:00:48.603922 containerd[1658]: 
time="2026-01-16T18:00:48.603869264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-j5dxx,Uid:64da405a-12aa-44ec-8b4d-a44866f591ec,Namespace:calico-system,Attempt:0,}" Jan 16 18:00:48.731862 systemd-networkd[1574]: cali185265902c8: Link UP Jan 16 18:00:48.732893 systemd-networkd[1574]: cali185265902c8: Gained carrier Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.657 [INFO][4655] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--t28mh-eth0 coredns-668d6bf9bc- kube-system 2f2a1d19-8f6a-44c8-adb8-cc33e1ecebce 820 0 2026-01-16 18:00:00 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4580-0-0-p-7f6b5ebc40 coredns-668d6bf9bc-t28mh eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali185265902c8 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" Namespace="kube-system" Pod="coredns-668d6bf9bc-t28mh" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--t28mh-" Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.657 [INFO][4655] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" Namespace="kube-system" Pod="coredns-668d6bf9bc-t28mh" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--t28mh-eth0" Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.690 [INFO][4682] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" HandleID="k8s-pod-network.409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" 
Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--t28mh-eth0" Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.690 [INFO][4682] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" HandleID="k8s-pod-network.409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--t28mh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000340460), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4580-0-0-p-7f6b5ebc40", "pod":"coredns-668d6bf9bc-t28mh", "timestamp":"2026-01-16 18:00:48.690142286 +0000 UTC"}, Hostname:"ci-4580-0-0-p-7f6b5ebc40", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.690 [INFO][4682] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.690 [INFO][4682] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.690 [INFO][4682] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-7f6b5ebc40' Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.699 [INFO][4682] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.704 [INFO][4682] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.709 [INFO][4682] ipam/ipam.go 511: Trying affinity for 192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.711 [INFO][4682] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.714 [INFO][4682] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.714 [INFO][4682] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.128/26 handle="k8s-pod-network.409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.716 [INFO][4682] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83 Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.720 [INFO][4682] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.128/26 handle="k8s-pod-network.409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.727 [INFO][4682] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.85.132/26] block=192.168.85.128/26 handle="k8s-pod-network.409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.727 [INFO][4682] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.132/26] handle="k8s-pod-network.409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.727 [INFO][4682] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 18:00:48.751300 containerd[1658]: 2026-01-16 18:00:48.727 [INFO][4682] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.132/26] IPv6=[] ContainerID="409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" HandleID="k8s-pod-network.409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--t28mh-eth0" Jan 16 18:00:48.752129 containerd[1658]: 2026-01-16 18:00:48.729 [INFO][4655] cni-plugin/k8s.go 418: Populated endpoint ContainerID="409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" Namespace="kube-system" Pod="coredns-668d6bf9bc-t28mh" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--t28mh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--t28mh-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"2f2a1d19-8f6a-44c8-adb8-cc33e1ecebce", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"", Pod:"coredns-668d6bf9bc-t28mh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.85.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali185265902c8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:48.752129 containerd[1658]: 2026-01-16 18:00:48.729 [INFO][4655] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.132/32] ContainerID="409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" Namespace="kube-system" Pod="coredns-668d6bf9bc-t28mh" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--t28mh-eth0" Jan 16 18:00:48.752129 containerd[1658]: 2026-01-16 18:00:48.730 [INFO][4655] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali185265902c8 ContainerID="409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" Namespace="kube-system" Pod="coredns-668d6bf9bc-t28mh" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--t28mh-eth0" Jan 16 18:00:48.752129 containerd[1658]: 2026-01-16 18:00:48.733 [INFO][4655] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" Namespace="kube-system" Pod="coredns-668d6bf9bc-t28mh" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--t28mh-eth0" Jan 16 18:00:48.752129 containerd[1658]: 2026-01-16 18:00:48.733 [INFO][4655] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" Namespace="kube-system" Pod="coredns-668d6bf9bc-t28mh" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--t28mh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--t28mh-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"2f2a1d19-8f6a-44c8-adb8-cc33e1ecebce", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83", Pod:"coredns-668d6bf9bc-t28mh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.85.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali185265902c8", 
MAC:"5a:0c:43:3d:3b:13", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:48.752129 containerd[1658]: 2026-01-16 18:00:48.747 [INFO][4655] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" Namespace="kube-system" Pod="coredns-668d6bf9bc-t28mh" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-coredns--668d6bf9bc--t28mh-eth0" Jan 16 18:00:48.753118 kubelet[2921]: E0116 18:00:48.753080 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:00:48.761951 kubelet[2921]: I0116 18:00:48.761863 2921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-5zv4w" podStartSLOduration=48.761845383 podStartE2EDuration="48.761845383s" podCreationTimestamp="2026-01-16 18:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 18:00:48.761071421 +0000 UTC m=+55.248858748" 
watchObservedRunningTime="2026-01-16 18:00:48.761845383 +0000 UTC m=+55.249632670" Jan 16 18:00:48.769000 audit[4708]: NETFILTER_CFG table=filter:129 family=2 entries=36 op=nft_register_chain pid=4708 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 18:00:48.769000 audit[4708]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19140 a0=3 a1=ffffea677a80 a2=0 a3=ffff81ca9fa8 items=0 ppid=4252 pid=4708 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.769000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 18:00:48.785000 audit[4711]: NETFILTER_CFG table=filter:130 family=2 entries=20 op=nft_register_rule pid=4711 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:48.785000 audit[4711]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffffcd69a90 a2=0 a3=1 items=0 ppid=3072 pid=4711 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.785000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:48.789605 containerd[1658]: time="2026-01-16T18:00:48.789563427Z" level=info msg="connecting to shim 409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83" address="unix:///run/containerd/s/0a28c2ccd6846098fac280c320d189a048efb377455dfcf2409007911f5924a7" namespace=k8s.io protocol=ttrpc version=3 Jan 16 18:00:48.793000 audit[4711]: NETFILTER_CFG table=nat:131 family=2 entries=14 op=nft_register_rule pid=4711 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 
18:00:48.793000 audit[4711]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=fffffcd69a90 a2=0 a3=1 items=0 ppid=3072 pid=4711 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.793000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:48.816000 audit[4742]: NETFILTER_CFG table=filter:132 family=2 entries=17 op=nft_register_rule pid=4742 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:48.816000 audit[4742]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff6b59820 a2=0 a3=1 items=0 ppid=3072 pid=4742 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.816000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:48.818658 systemd[1]: Started cri-containerd-409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83.scope - libcontainer container 409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83. 
Jan 16 18:00:48.822000 audit[4742]: NETFILTER_CFG table=nat:133 family=2 entries=35 op=nft_register_chain pid=4742 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:48.822000 audit[4742]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=fffff6b59820 a2=0 a3=1 items=0 ppid=3072 pid=4742 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.822000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:48.831000 audit: BPF prog-id=226 op=LOAD Jan 16 18:00:48.832000 audit: BPF prog-id=227 op=LOAD Jan 16 18:00:48.832000 audit[4730]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=400010c180 a2=98 a3=0 items=0 ppid=4719 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.832000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430396363333032386434306331653633626632306566663665313663 Jan 16 18:00:48.832000 audit: BPF prog-id=227 op=UNLOAD Jan 16 18:00:48.832000 audit[4730]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4719 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.832000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430396363333032386434306331653633626632306566663665313663 Jan 16 18:00:48.832000 audit: BPF prog-id=228 op=LOAD Jan 16 18:00:48.832000 audit[4730]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=400010c3e8 a2=98 a3=0 items=0 ppid=4719 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.832000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430396363333032386434306331653633626632306566663665313663 Jan 16 18:00:48.832000 audit: BPF prog-id=229 op=LOAD Jan 16 18:00:48.832000 audit[4730]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=400010c168 a2=98 a3=0 items=0 ppid=4719 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.832000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430396363333032386434306331653633626632306566663665313663 Jan 16 18:00:48.832000 audit: BPF prog-id=229 op=UNLOAD Jan 16 18:00:48.832000 audit[4730]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4719 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 16 18:00:48.832000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430396363333032386434306331653633626632306566663665313663 Jan 16 18:00:48.832000 audit: BPF prog-id=228 op=UNLOAD Jan 16 18:00:48.832000 audit[4730]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4719 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.832000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430396363333032386434306331653633626632306566663665313663 Jan 16 18:00:48.832000 audit: BPF prog-id=230 op=LOAD Jan 16 18:00:48.832000 audit[4730]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=400010c648 a2=98 a3=0 items=0 ppid=4719 pid=4730 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.832000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430396363333032386434306331653633626632306566663665313663 Jan 16 18:00:48.852664 systemd-networkd[1574]: cali5d7b09a364e: Link UP Jan 16 18:00:48.853799 systemd-networkd[1574]: cali5d7b09a364e: Gained carrier Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.663 [INFO][4667] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint 
projectcalico.org/v3} {ci--4580--0--0--p--7f6b5ebc40-k8s-goldmane--666569f655--j5dxx-eth0 goldmane-666569f655- calico-system 64da405a-12aa-44ec-8b4d-a44866f591ec 825 0 2026-01-16 18:00:16 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4580-0-0-p-7f6b5ebc40 goldmane-666569f655-j5dxx eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali5d7b09a364e [] [] }} ContainerID="09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" Namespace="calico-system" Pod="goldmane-666569f655-j5dxx" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-goldmane--666569f655--j5dxx-" Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.663 [INFO][4667] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" Namespace="calico-system" Pod="goldmane-666569f655-j5dxx" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-goldmane--666569f655--j5dxx-eth0" Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.696 [INFO][4689] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" HandleID="k8s-pod-network.09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-goldmane--666569f655--j5dxx-eth0" Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.696 [INFO][4689] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" HandleID="k8s-pod-network.09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-goldmane--666569f655--j5dxx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000598890), 
Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-7f6b5ebc40", "pod":"goldmane-666569f655-j5dxx", "timestamp":"2026-01-16 18:00:48.696191944 +0000 UTC"}, Hostname:"ci-4580-0-0-p-7f6b5ebc40", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.696 [INFO][4689] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.727 [INFO][4689] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.727 [INFO][4689] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-7f6b5ebc40' Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.803 [INFO][4689] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.811 [INFO][4689] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.817 [INFO][4689] ipam/ipam.go 511: Trying affinity for 192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.820 [INFO][4689] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.824 [INFO][4689] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.824 [INFO][4689] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.128/26 
handle="k8s-pod-network.09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.826 [INFO][4689] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84 Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.834 [INFO][4689] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.128/26 handle="k8s-pod-network.09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.842 [INFO][4689] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.85.133/26] block=192.168.85.128/26 handle="k8s-pod-network.09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.843 [INFO][4689] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.133/26] handle="k8s-pod-network.09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.843 [INFO][4689] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 16 18:00:48.869576 containerd[1658]: 2026-01-16 18:00:48.843 [INFO][4689] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.133/26] IPv6=[] ContainerID="09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" HandleID="k8s-pod-network.09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-goldmane--666569f655--j5dxx-eth0" Jan 16 18:00:48.870064 containerd[1658]: 2026-01-16 18:00:48.845 [INFO][4667] cni-plugin/k8s.go 418: Populated endpoint ContainerID="09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" Namespace="calico-system" Pod="goldmane-666569f655-j5dxx" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-goldmane--666569f655--j5dxx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-goldmane--666569f655--j5dxx-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"64da405a-12aa-44ec-8b4d-a44866f591ec", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"", Pod:"goldmane-666569f655-j5dxx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.85.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5d7b09a364e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:48.870064 containerd[1658]: 2026-01-16 18:00:48.846 [INFO][4667] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.133/32] ContainerID="09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" Namespace="calico-system" Pod="goldmane-666569f655-j5dxx" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-goldmane--666569f655--j5dxx-eth0" Jan 16 18:00:48.870064 containerd[1658]: 2026-01-16 18:00:48.846 [INFO][4667] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5d7b09a364e ContainerID="09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" Namespace="calico-system" Pod="goldmane-666569f655-j5dxx" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-goldmane--666569f655--j5dxx-eth0" Jan 16 18:00:48.870064 containerd[1658]: 2026-01-16 18:00:48.854 [INFO][4667] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" Namespace="calico-system" Pod="goldmane-666569f655-j5dxx" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-goldmane--666569f655--j5dxx-eth0" Jan 16 18:00:48.870064 containerd[1658]: 2026-01-16 18:00:48.855 [INFO][4667] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" Namespace="calico-system" Pod="goldmane-666569f655-j5dxx" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-goldmane--666569f655--j5dxx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-goldmane--666569f655--j5dxx-eth0", GenerateName:"goldmane-666569f655-", 
Namespace:"calico-system", SelfLink:"", UID:"64da405a-12aa-44ec-8b4d-a44866f591ec", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84", Pod:"goldmane-666569f655-j5dxx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.85.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5d7b09a364e", MAC:"da:59:4e:08:73:ce", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:48.870064 containerd[1658]: 2026-01-16 18:00:48.866 [INFO][4667] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" Namespace="calico-system" Pod="goldmane-666569f655-j5dxx" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-goldmane--666569f655--j5dxx-eth0" Jan 16 18:00:48.874408 containerd[1658]: time="2026-01-16T18:00:48.874220843Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-t28mh,Uid:2f2a1d19-8f6a-44c8-adb8-cc33e1ecebce,Namespace:kube-system,Attempt:0,} returns sandbox id \"409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83\"" Jan 16 18:00:48.877021 containerd[1658]: 
time="2026-01-16T18:00:48.876975212Z" level=info msg="CreateContainer within sandbox \"409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 16 18:00:48.883000 audit[4767]: NETFILTER_CFG table=filter:134 family=2 entries=52 op=nft_register_chain pid=4767 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 18:00:48.883000 audit[4767]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=27540 a0=3 a1=ffffc94012c0 a2=0 a3=ffff8c664fa8 items=0 ppid=4252 pid=4767 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.883000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 18:00:48.891713 containerd[1658]: time="2026-01-16T18:00:48.891669336Z" level=info msg="Container fc973c428fa2393318c7992487cdd7ac4fe26235536556b18e852870ece55135: CDI devices from CRI Config.CDIDevices: []" Jan 16 18:00:48.896009 containerd[1658]: time="2026-01-16T18:00:48.895840309Z" level=info msg="connecting to shim 09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84" address="unix:///run/containerd/s/de6d157a4b352b8462085d305ccbf05f62d0fc0b5a006431cbbc45514c15091b" namespace=k8s.io protocol=ttrpc version=3 Jan 16 18:00:48.899196 containerd[1658]: time="2026-01-16T18:00:48.899157159Z" level=info msg="CreateContainer within sandbox \"409cc3028d40c1e63bf20eff6e16ce36132619cc6839a4341c0677ece9f5ce83\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"fc973c428fa2393318c7992487cdd7ac4fe26235536556b18e852870ece55135\"" Jan 16 18:00:48.899808 containerd[1658]: time="2026-01-16T18:00:48.899779601Z" level=info msg="StartContainer for 
\"fc973c428fa2393318c7992487cdd7ac4fe26235536556b18e852870ece55135\"" Jan 16 18:00:48.901168 containerd[1658]: time="2026-01-16T18:00:48.901114165Z" level=info msg="connecting to shim fc973c428fa2393318c7992487cdd7ac4fe26235536556b18e852870ece55135" address="unix:///run/containerd/s/0a28c2ccd6846098fac280c320d189a048efb377455dfcf2409007911f5924a7" protocol=ttrpc version=3 Jan 16 18:00:48.921717 systemd[1]: Started cri-containerd-09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84.scope - libcontainer container 09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84. Jan 16 18:00:48.922725 systemd[1]: Started cri-containerd-fc973c428fa2393318c7992487cdd7ac4fe26235536556b18e852870ece55135.scope - libcontainer container fc973c428fa2393318c7992487cdd7ac4fe26235536556b18e852870ece55135. Jan 16 18:00:48.931000 audit: BPF prog-id=231 op=LOAD Jan 16 18:00:48.932000 audit: BPF prog-id=232 op=LOAD Jan 16 18:00:48.932000 audit[4789]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4775 pid=4789 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039613864353266376330393664663235653461316165353933383761 Jan 16 18:00:48.932000 audit: BPF prog-id=232 op=UNLOAD Jan 16 18:00:48.932000 audit[4789]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4775 pid=4789 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.932000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039613864353266376330393664663235653461316165353933383761 Jan 16 18:00:48.932000 audit: BPF prog-id=233 op=LOAD Jan 16 18:00:48.932000 audit[4789]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4775 pid=4789 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039613864353266376330393664663235653461316165353933383761 Jan 16 18:00:48.932000 audit: BPF prog-id=234 op=LOAD Jan 16 18:00:48.932000 audit[4789]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4775 pid=4789 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039613864353266376330393664663235653461316165353933383761 Jan 16 18:00:48.932000 audit: BPF prog-id=234 op=UNLOAD Jan 16 18:00:48.932000 audit[4789]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4775 pid=4789 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 16 18:00:48.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039613864353266376330393664663235653461316165353933383761 Jan 16 18:00:48.932000 audit: BPF prog-id=233 op=UNLOAD Jan 16 18:00:48.932000 audit[4789]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4775 pid=4789 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039613864353266376330393664663235653461316165353933383761 Jan 16 18:00:48.932000 audit: BPF prog-id=235 op=LOAD Jan 16 18:00:48.932000 audit[4789]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4775 pid=4789 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039613864353266376330393664663235653461316165353933383761 Jan 16 18:00:48.933000 audit: BPF prog-id=236 op=LOAD Jan 16 18:00:48.934000 audit: BPF prog-id=237 op=LOAD Jan 16 18:00:48.934000 audit[4787]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4719 pid=4787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.934000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663393733633432386661323339333331386337393932343837636464 Jan 16 18:00:48.934000 audit: BPF prog-id=237 op=UNLOAD Jan 16 18:00:48.934000 audit[4787]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4719 pid=4787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.934000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663393733633432386661323339333331386337393932343837636464 Jan 16 18:00:48.934000 audit: BPF prog-id=238 op=LOAD Jan 16 18:00:48.934000 audit[4787]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4719 pid=4787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.934000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663393733633432386661323339333331386337393932343837636464 Jan 16 18:00:48.934000 audit: BPF prog-id=239 op=LOAD Jan 16 18:00:48.934000 audit[4787]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4719 pid=4787 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.934000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663393733633432386661323339333331386337393932343837636464 Jan 16 18:00:48.934000 audit: BPF prog-id=239 op=UNLOAD Jan 16 18:00:48.934000 audit[4787]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4719 pid=4787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.934000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663393733633432386661323339333331386337393932343837636464 Jan 16 18:00:48.934000 audit: BPF prog-id=238 op=UNLOAD Jan 16 18:00:48.934000 audit[4787]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4719 pid=4787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.934000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663393733633432386661323339333331386337393932343837636464 Jan 16 18:00:48.934000 audit: BPF prog-id=240 op=LOAD Jan 16 18:00:48.934000 audit[4787]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 
ppid=4719 pid=4787 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.934000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663393733633432386661323339333331386337393932343837636464 Jan 16 18:00:48.958446 containerd[1658]: time="2026-01-16T18:00:48.958075777Z" level=info msg="StartContainer for \"fc973c428fa2393318c7992487cdd7ac4fe26235536556b18e852870ece55135\" returns successfully" Jan 16 18:00:48.963201 containerd[1658]: time="2026-01-16T18:00:48.963143433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-j5dxx,Uid:64da405a-12aa-44ec-8b4d-a44866f591ec,Namespace:calico-system,Attempt:0,} returns sandbox id \"09a8d52f7c096df25e4a1ae59387a61b67c0d255c6ff927b24352ee8b72f4b84\"" Jan 16 18:00:48.964585 containerd[1658]: time="2026-01-16T18:00:48.964551117Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 16 18:00:49.137567 systemd-networkd[1574]: calib0289440d4e: Gained IPv6LL Jan 16 18:00:49.324128 containerd[1658]: time="2026-01-16T18:00:49.324079086Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:49.325202 containerd[1658]: time="2026-01-16T18:00:49.325158930Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 16 18:00:49.325296 containerd[1658]: time="2026-01-16T18:00:49.325230130Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:49.325427 kubelet[2921]: E0116 
18:00:49.325383 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 18:00:49.325701 kubelet[2921]: E0116 18:00:49.325450 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 18:00:49.325701 kubelet[2921]: E0116 18:00:49.325570 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountP
ropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbbxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-j5dxx_calico-system(64da405a-12aa-44ec-8b4d-a44866f591ec): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:49.327718 kubelet[2921]: E0116 18:00:49.327672 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:00:49.601715 containerd[1658]: time="2026-01-16T18:00:49.601646807Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cdd49cc55-ql8tz,Uid:3860e0e2-bc4a-40cc-b58d-f0b04ee81f50,Namespace:calico-apiserver,Attempt:0,}" Jan 16 18:00:49.722021 systemd-networkd[1574]: calif37e7237247: Link UP Jan 16 18:00:49.722487 systemd-networkd[1574]: calif37e7237247: Gained carrier Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.647 [INFO][4850] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ql8tz-eth0 calico-apiserver-5cdd49cc55- calico-apiserver 3860e0e2-bc4a-40cc-b58d-f0b04ee81f50 827 0 2026-01-16 18:00:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5cdd49cc55 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4580-0-0-p-7f6b5ebc40 calico-apiserver-5cdd49cc55-ql8tz eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif37e7237247 [] [] }} ContainerID="fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ql8tz" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ql8tz-" Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.647 [INFO][4850] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ql8tz" 
WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ql8tz-eth0" Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.676 [INFO][4865] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" HandleID="k8s-pod-network.fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ql8tz-eth0" Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.676 [INFO][4865] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" HandleID="k8s-pod-network.fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ql8tz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004300b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4580-0-0-p-7f6b5ebc40", "pod":"calico-apiserver-5cdd49cc55-ql8tz", "timestamp":"2026-01-16 18:00:49.676398394 +0000 UTC"}, Hostname:"ci-4580-0-0-p-7f6b5ebc40", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.676 [INFO][4865] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.676 [INFO][4865] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.676 [INFO][4865] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-7f6b5ebc40' Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.686 [INFO][4865] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.693 [INFO][4865] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.697 [INFO][4865] ipam/ipam.go 511: Trying affinity for 192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.699 [INFO][4865] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.702 [INFO][4865] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.702 [INFO][4865] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.128/26 handle="k8s-pod-network.fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.705 [INFO][4865] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526 Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.709 [INFO][4865] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.128/26 handle="k8s-pod-network.fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.716 [INFO][4865] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.85.134/26] block=192.168.85.128/26 handle="k8s-pod-network.fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.716 [INFO][4865] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.134/26] handle="k8s-pod-network.fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.716 [INFO][4865] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 18:00:49.735237 containerd[1658]: 2026-01-16 18:00:49.716 [INFO][4865] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.134/26] IPv6=[] ContainerID="fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" HandleID="k8s-pod-network.fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ql8tz-eth0" Jan 16 18:00:49.735952 containerd[1658]: 2026-01-16 18:00:49.719 [INFO][4850] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ql8tz" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ql8tz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ql8tz-eth0", GenerateName:"calico-apiserver-5cdd49cc55-", Namespace:"calico-apiserver", SelfLink:"", UID:"3860e0e2-bc4a-40cc-b58d-f0b04ee81f50", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5cdd49cc55", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"", Pod:"calico-apiserver-5cdd49cc55-ql8tz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.85.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif37e7237247", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:49.735952 containerd[1658]: 2026-01-16 18:00:49.719 [INFO][4850] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.134/32] ContainerID="fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ql8tz" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ql8tz-eth0" Jan 16 18:00:49.735952 containerd[1658]: 2026-01-16 18:00:49.719 [INFO][4850] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif37e7237247 ContainerID="fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ql8tz" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ql8tz-eth0" Jan 16 18:00:49.735952 containerd[1658]: 2026-01-16 18:00:49.722 [INFO][4850] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" Namespace="calico-apiserver" 
Pod="calico-apiserver-5cdd49cc55-ql8tz" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ql8tz-eth0" Jan 16 18:00:49.735952 containerd[1658]: 2026-01-16 18:00:49.722 [INFO][4850] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ql8tz" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ql8tz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ql8tz-eth0", GenerateName:"calico-apiserver-5cdd49cc55-", Namespace:"calico-apiserver", SelfLink:"", UID:"3860e0e2-bc4a-40cc-b58d-f0b04ee81f50", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5cdd49cc55", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526", Pod:"calico-apiserver-5cdd49cc55-ql8tz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.85.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"calif37e7237247", MAC:"22:a4:3e:f3:d0:09", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:49.735952 containerd[1658]: 2026-01-16 18:00:49.731 [INFO][4850] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" Namespace="calico-apiserver" Pod="calico-apiserver-5cdd49cc55-ql8tz" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--apiserver--5cdd49cc55--ql8tz-eth0" Jan 16 18:00:49.756000 audit[4881]: NETFILTER_CFG table=filter:135 family=2 entries=55 op=nft_register_chain pid=4881 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 18:00:49.756000 audit[4881]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=28288 a0=3 a1=ffffd9b0ca90 a2=0 a3=ffff9bb7ffa8 items=0 ppid=4252 pid=4881 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:49.756000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 18:00:49.760736 kubelet[2921]: E0116 18:00:49.760688 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:00:49.761000 kubelet[2921]: E0116 18:00:49.760716 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:00:49.762089 containerd[1658]: time="2026-01-16T18:00:49.762013333Z" level=info msg="connecting to shim fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526" address="unix:///run/containerd/s/451f58b36d66534e6582f99939428d6dae03f665079884fbe65039c0d1505caa" namespace=k8s.io protocol=ttrpc version=3 Jan 16 18:00:49.801441 kubelet[2921]: I0116 18:00:49.801275 2921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-t28mh" podStartSLOduration=49.801256572 podStartE2EDuration="49.801256572s" podCreationTimestamp="2026-01-16 18:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 18:00:49.785564525 +0000 UTC m=+56.273351812" watchObservedRunningTime="2026-01-16 18:00:49.801256572 +0000 UTC m=+56.289043859" Jan 16 18:00:49.804798 systemd[1]: Started cri-containerd-fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526.scope - libcontainer container fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526. 
Jan 16 18:00:49.844000 audit[4920]: NETFILTER_CFG table=filter:136 family=2 entries=14 op=nft_register_rule pid=4920 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:49.844000 audit[4920]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffffcc74ba0 a2=0 a3=1 items=0 ppid=3072 pid=4920 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:49.844000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:49.850000 audit: BPF prog-id=241 op=LOAD Jan 16 18:00:49.851000 audit: BPF prog-id=242 op=LOAD Jan 16 18:00:49.851000 audit[4902]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000138180 a2=98 a3=0 items=0 ppid=4891 pid=4902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:49.851000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663613530366630336634343963613563333262663133326664633963 Jan 16 18:00:49.851000 audit: BPF prog-id=242 op=UNLOAD Jan 16 18:00:49.851000 audit[4902]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4891 pid=4902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:49.851000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663613530366630336634343963613563333262663133326664633963 Jan 16 18:00:49.851000 audit: BPF prog-id=243 op=LOAD Jan 16 18:00:49.851000 audit[4902]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001383e8 a2=98 a3=0 items=0 ppid=4891 pid=4902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:49.851000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663613530366630336634343963613563333262663133326664633963 Jan 16 18:00:49.851000 audit: BPF prog-id=244 op=LOAD Jan 16 18:00:49.851000 audit[4902]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000138168 a2=98 a3=0 items=0 ppid=4891 pid=4902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:49.851000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663613530366630336634343963613563333262663133326664633963 Jan 16 18:00:49.851000 audit[4920]: NETFILTER_CFG table=nat:137 family=2 entries=44 op=nft_register_rule pid=4920 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:49.851000 audit[4920]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=fffffcc74ba0 a2=0 a3=1 items=0 ppid=3072 pid=4920 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:49.851000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:49.851000 audit: BPF prog-id=244 op=UNLOAD Jan 16 18:00:49.851000 audit[4902]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4891 pid=4902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:49.851000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663613530366630336634343963613563333262663133326664633963 Jan 16 18:00:49.851000 audit: BPF prog-id=243 op=UNLOAD Jan 16 18:00:49.851000 audit[4902]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4891 pid=4902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:49.851000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663613530366630336634343963613563333262663133326664633963 Jan 16 18:00:49.852000 audit: BPF prog-id=245 op=LOAD Jan 16 18:00:49.852000 audit[4902]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000138648 a2=98 a3=0 items=0 ppid=4891 pid=4902 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:49.852000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663613530366630336634343963613563333262663133326664633963 Jan 16 18:00:49.876387 containerd[1658]: time="2026-01-16T18:00:49.876343600Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5cdd49cc55-ql8tz,Uid:3860e0e2-bc4a-40cc-b58d-f0b04ee81f50,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"fca506f03f449ca5c32bf132fdc9cabd510638f3944df64080b48e9592ed6526\"" Jan 16 18:00:49.877736 containerd[1658]: time="2026-01-16T18:00:49.877711524Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 18:00:49.905730 systemd-networkd[1574]: cali185265902c8: Gained IPv6LL Jan 16 18:00:49.906059 systemd-networkd[1574]: califbc2e79502c: Gained IPv6LL Jan 16 18:00:50.209917 containerd[1658]: time="2026-01-16T18:00:50.209804690Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:50.211757 containerd[1658]: time="2026-01-16T18:00:50.211657896Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 18:00:50.211837 containerd[1658]: time="2026-01-16T18:00:50.211726336Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:50.211888 kubelet[2921]: E0116 18:00:50.211847 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:00:50.211924 kubelet[2921]: E0116 18:00:50.211890 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:00:50.212059 kubelet[2921]: E0116 18:00:50.212011 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzkc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5cdd49cc55-ql8tz_calico-apiserver(3860e0e2-bc4a-40cc-b58d-f0b04ee81f50): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:50.213455 kubelet[2921]: E0116 18:00:50.213190 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:00:50.602154 containerd[1658]: time="2026-01-16T18:00:50.602009958Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b6469dfdd-hzhpv,Uid:cbb94aae-850a-4651-be6c-0c622d131a34,Namespace:calico-system,Attempt:0,}" Jan 16 18:00:50.708636 systemd-networkd[1574]: 
cali69f36e807cc: Link UP Jan 16 18:00:50.708776 systemd-networkd[1574]: cali69f36e807cc: Gained carrier Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.644 [INFO][4931] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--7f6b5ebc40-k8s-calico--kube--controllers--7b6469dfdd--hzhpv-eth0 calico-kube-controllers-7b6469dfdd- calico-system cbb94aae-850a-4651-be6c-0c622d131a34 824 0 2026-01-16 18:00:19 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7b6469dfdd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4580-0-0-p-7f6b5ebc40 calico-kube-controllers-7b6469dfdd-hzhpv eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali69f36e807cc [] [] }} ContainerID="5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" Namespace="calico-system" Pod="calico-kube-controllers-7b6469dfdd-hzhpv" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--kube--controllers--7b6469dfdd--hzhpv-" Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.644 [INFO][4931] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" Namespace="calico-system" Pod="calico-kube-controllers-7b6469dfdd-hzhpv" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--kube--controllers--7b6469dfdd--hzhpv-eth0" Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.668 [INFO][4946] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" HandleID="k8s-pod-network.5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" 
Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--kube--controllers--7b6469dfdd--hzhpv-eth0" Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.668 [INFO][4946] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" HandleID="k8s-pod-network.5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--kube--controllers--7b6469dfdd--hzhpv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40005020e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-7f6b5ebc40", "pod":"calico-kube-controllers-7b6469dfdd-hzhpv", "timestamp":"2026-01-16 18:00:50.668403039 +0000 UTC"}, Hostname:"ci-4580-0-0-p-7f6b5ebc40", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.668 [INFO][4946] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.668 [INFO][4946] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.668 [INFO][4946] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-7f6b5ebc40' Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.678 [INFO][4946] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.682 [INFO][4946] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.687 [INFO][4946] ipam/ipam.go 511: Trying affinity for 192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.689 [INFO][4946] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.691 [INFO][4946] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.691 [INFO][4946] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.128/26 handle="k8s-pod-network.5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.693 [INFO][4946] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4 Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.697 [INFO][4946] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.128/26 handle="k8s-pod-network.5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.704 [INFO][4946] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.85.135/26] block=192.168.85.128/26 handle="k8s-pod-network.5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.704 [INFO][4946] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.135/26] handle="k8s-pod-network.5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.704 [INFO][4946] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 18:00:50.721700 containerd[1658]: 2026-01-16 18:00:50.705 [INFO][4946] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.135/26] IPv6=[] ContainerID="5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" HandleID="k8s-pod-network.5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--kube--controllers--7b6469dfdd--hzhpv-eth0" Jan 16 18:00:50.722384 containerd[1658]: 2026-01-16 18:00:50.706 [INFO][4931] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" Namespace="calico-system" Pod="calico-kube-controllers-7b6469dfdd-hzhpv" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--kube--controllers--7b6469dfdd--hzhpv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-calico--kube--controllers--7b6469dfdd--hzhpv-eth0", GenerateName:"calico-kube-controllers-7b6469dfdd-", Namespace:"calico-system", SelfLink:"", UID:"cbb94aae-850a-4651-be6c-0c622d131a34", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7b6469dfdd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"", Pod:"calico-kube-controllers-7b6469dfdd-hzhpv", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.85.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali69f36e807cc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:50.722384 containerd[1658]: 2026-01-16 18:00:50.706 [INFO][4931] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.135/32] ContainerID="5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" Namespace="calico-system" Pod="calico-kube-controllers-7b6469dfdd-hzhpv" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--kube--controllers--7b6469dfdd--hzhpv-eth0" Jan 16 18:00:50.722384 containerd[1658]: 2026-01-16 18:00:50.706 [INFO][4931] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali69f36e807cc ContainerID="5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" Namespace="calico-system" Pod="calico-kube-controllers-7b6469dfdd-hzhpv" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--kube--controllers--7b6469dfdd--hzhpv-eth0" Jan 16 18:00:50.722384 containerd[1658]: 2026-01-16 18:00:50.709 [INFO][4931] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" Namespace="calico-system" Pod="calico-kube-controllers-7b6469dfdd-hzhpv" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--kube--controllers--7b6469dfdd--hzhpv-eth0" Jan 16 18:00:50.722384 containerd[1658]: 2026-01-16 18:00:50.709 [INFO][4931] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" Namespace="calico-system" Pod="calico-kube-controllers-7b6469dfdd-hzhpv" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--kube--controllers--7b6469dfdd--hzhpv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-calico--kube--controllers--7b6469dfdd--hzhpv-eth0", GenerateName:"calico-kube-controllers-7b6469dfdd-", Namespace:"calico-system", SelfLink:"", UID:"cbb94aae-850a-4651-be6c-0c622d131a34", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7b6469dfdd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4", Pod:"calico-kube-controllers-7b6469dfdd-hzhpv", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.85.135/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali69f36e807cc", MAC:"62:97:79:1c:85:2b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:50.722384 containerd[1658]: 2026-01-16 18:00:50.718 [INFO][4931] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" Namespace="calico-system" Pod="calico-kube-controllers-7b6469dfdd-hzhpv" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-calico--kube--controllers--7b6469dfdd--hzhpv-eth0" Jan 16 18:00:50.731000 audit[4961]: NETFILTER_CFG table=filter:138 family=2 entries=48 op=nft_register_chain pid=4961 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 18:00:50.731000 audit[4961]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=23108 a0=3 a1=ffffea6d58f0 a2=0 a3=ffff8f1aafa8 items=0 ppid=4252 pid=4961 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:50.731000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 18:00:50.763055 kubelet[2921]: E0116 18:00:50.762914 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" 
podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:00:50.763684 kubelet[2921]: E0116 18:00:50.763470 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:00:50.771881 containerd[1658]: time="2026-01-16T18:00:50.771517632Z" level=info msg="connecting to shim 5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4" address="unix:///run/containerd/s/4adcb1b5aff8e4d2b65a63feee23a1acc62dac20b69d8d4ab5a0c56c37d85f54" namespace=k8s.io protocol=ttrpc version=3 Jan 16 18:00:50.796000 audit[4996]: NETFILTER_CFG table=filter:139 family=2 entries=14 op=nft_register_rule pid=4996 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:50.796000 audit[4996]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd0ef4310 a2=0 a3=1 items=0 ppid=3072 pid=4996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:50.796000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:50.801546 systemd-networkd[1574]: cali5d7b09a364e: Gained IPv6LL Jan 16 18:00:50.803973 systemd[1]: Started cri-containerd-5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4.scope - libcontainer container 5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4. 
Jan 16 18:00:50.814000 audit[4996]: NETFILTER_CFG table=nat:140 family=2 entries=56 op=nft_register_chain pid=4996 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:00:50.814000 audit[4996]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=ffffd0ef4310 a2=0 a3=1 items=0 ppid=3072 pid=4996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:50.814000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:00:50.826000 audit: BPF prog-id=246 op=LOAD Jan 16 18:00:50.826000 audit: BPF prog-id=247 op=LOAD Jan 16 18:00:50.826000 audit[4983]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000186180 a2=98 a3=0 items=0 ppid=4971 pid=4983 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:50.826000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564663364396634366439646131653363626432346432393032383430 Jan 16 18:00:50.826000 audit: BPF prog-id=247 op=UNLOAD Jan 16 18:00:50.826000 audit[4983]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4971 pid=4983 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:50.826000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564663364396634366439646131653363626432346432393032383430 Jan 16 18:00:50.827000 audit: BPF prog-id=248 op=LOAD Jan 16 18:00:50.827000 audit[4983]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001863e8 a2=98 a3=0 items=0 ppid=4971 pid=4983 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:50.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564663364396634366439646131653363626432346432393032383430 Jan 16 18:00:50.827000 audit: BPF prog-id=249 op=LOAD Jan 16 18:00:50.827000 audit[4983]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000186168 a2=98 a3=0 items=0 ppid=4971 pid=4983 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:50.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564663364396634366439646131653363626432346432393032383430 Jan 16 18:00:50.827000 audit: BPF prog-id=249 op=UNLOAD Jan 16 18:00:50.827000 audit[4983]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4971 pid=4983 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 16 18:00:50.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564663364396634366439646131653363626432346432393032383430 Jan 16 18:00:50.827000 audit: BPF prog-id=248 op=UNLOAD Jan 16 18:00:50.827000 audit[4983]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4971 pid=4983 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:50.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564663364396634366439646131653363626432346432393032383430 Jan 16 18:00:50.827000 audit: BPF prog-id=250 op=LOAD Jan 16 18:00:50.827000 audit[4983]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000186648 a2=98 a3=0 items=0 ppid=4971 pid=4983 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:50.827000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3564663364396634366439646131653363626432346432393032383430 Jan 16 18:00:50.851506 containerd[1658]: time="2026-01-16T18:00:50.851473474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7b6469dfdd-hzhpv,Uid:cbb94aae-850a-4651-be6c-0c622d131a34,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"5df3d9f46d9da1e3cbd24d2902840d1363ec2a3b9d4ce17bddb893895beaabc4\"" Jan 16 18:00:50.853694 containerd[1658]: time="2026-01-16T18:00:50.853606761Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 16 18:00:51.198476 containerd[1658]: time="2026-01-16T18:00:51.198282725Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:51.200864 containerd[1658]: time="2026-01-16T18:00:51.200764252Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 16 18:00:51.200864 containerd[1658]: time="2026-01-16T18:00:51.200814253Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:51.201035 kubelet[2921]: E0116 18:00:51.200997 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 18:00:51.201089 kubelet[2921]: E0116 18:00:51.201042 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 18:00:51.201207 kubelet[2921]: E0116 18:00:51.201159 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84qv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b6469dfdd-hzhpv_calico-system(cbb94aae-850a-4651-be6c-0c622d131a34): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:51.202492 kubelet[2921]: E0116 18:00:51.202444 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:00:51.603608 containerd[1658]: time="2026-01-16T18:00:51.602568670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rb578,Uid:08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3,Namespace:calico-system,Attempt:0,}" Jan 16 18:00:51.697597 systemd-networkd[1574]: calif37e7237247: Gained 
IPv6LL Jan 16 18:00:51.721599 systemd-networkd[1574]: calib475d680b24: Link UP Jan 16 18:00:51.721865 systemd-networkd[1574]: calib475d680b24: Gained carrier Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.654 [INFO][5018] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--7f6b5ebc40-k8s-csi--node--driver--rb578-eth0 csi-node-driver- calico-system 08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3 707 0 2026-01-16 18:00:19 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4580-0-0-p-7f6b5ebc40 csi-node-driver-rb578 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calib475d680b24 [] [] }} ContainerID="59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" Namespace="calico-system" Pod="csi-node-driver-rb578" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-csi--node--driver--rb578-" Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.654 [INFO][5018] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" Namespace="calico-system" Pod="csi-node-driver-rb578" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-csi--node--driver--rb578-eth0" Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.676 [INFO][5033] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" HandleID="k8s-pod-network.59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-csi--node--driver--rb578-eth0" Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.676 [INFO][5033] 
ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" HandleID="k8s-pod-network.59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-csi--node--driver--rb578-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d460), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-7f6b5ebc40", "pod":"csi-node-driver-rb578", "timestamp":"2026-01-16 18:00:51.676346533 +0000 UTC"}, Hostname:"ci-4580-0-0-p-7f6b5ebc40", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.676 [INFO][5033] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.676 [INFO][5033] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.676 [INFO][5033] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-7f6b5ebc40' Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.686 [INFO][5033] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.691 [INFO][5033] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.696 [INFO][5033] ipam/ipam.go 511: Trying affinity for 192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.699 [INFO][5033] ipam/ipam.go 158: Attempting to load block cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.702 [INFO][5033] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.85.128/26 host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.702 [INFO][5033] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.85.128/26 handle="k8s-pod-network.59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.705 [INFO][5033] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843 Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.710 [INFO][5033] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.85.128/26 handle="k8s-pod-network.59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.717 [INFO][5033] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.85.136/26] block=192.168.85.128/26 handle="k8s-pod-network.59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.717 [INFO][5033] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.85.136/26] handle="k8s-pod-network.59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" host="ci-4580-0-0-p-7f6b5ebc40" Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.717 [INFO][5033] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 18:00:51.735468 containerd[1658]: 2026-01-16 18:00:51.717 [INFO][5033] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.85.136/26] IPv6=[] ContainerID="59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" HandleID="k8s-pod-network.59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" Workload="ci--4580--0--0--p--7f6b5ebc40-k8s-csi--node--driver--rb578-eth0" Jan 16 18:00:51.736776 containerd[1658]: 2026-01-16 18:00:51.719 [INFO][5018] cni-plugin/k8s.go 418: Populated endpoint ContainerID="59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" Namespace="calico-system" Pod="csi-node-driver-rb578" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-csi--node--driver--rb578-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-csi--node--driver--rb578-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"", Pod:"csi-node-driver-rb578", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.85.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib475d680b24", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:51.736776 containerd[1658]: 2026-01-16 18:00:51.719 [INFO][5018] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.85.136/32] ContainerID="59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" Namespace="calico-system" Pod="csi-node-driver-rb578" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-csi--node--driver--rb578-eth0" Jan 16 18:00:51.736776 containerd[1658]: 2026-01-16 18:00:51.719 [INFO][5018] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib475d680b24 ContainerID="59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" Namespace="calico-system" Pod="csi-node-driver-rb578" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-csi--node--driver--rb578-eth0" Jan 16 18:00:51.736776 containerd[1658]: 2026-01-16 18:00:51.722 [INFO][5018] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" Namespace="calico-system" Pod="csi-node-driver-rb578" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-csi--node--driver--rb578-eth0" Jan 16 18:00:51.736776 
containerd[1658]: 2026-01-16 18:00:51.722 [INFO][5018] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" Namespace="calico-system" Pod="csi-node-driver-rb578" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-csi--node--driver--rb578-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--7f6b5ebc40-k8s-csi--node--driver--rb578-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 18, 0, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-7f6b5ebc40", ContainerID:"59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843", Pod:"csi-node-driver-rb578", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.85.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib475d680b24", MAC:"ae:25:3c:05:e2:b7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 18:00:51.736776 containerd[1658]: 
2026-01-16 18:00:51.732 [INFO][5018] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" Namespace="calico-system" Pod="csi-node-driver-rb578" WorkloadEndpoint="ci--4580--0--0--p--7f6b5ebc40-k8s-csi--node--driver--rb578-eth0" Jan 16 18:00:51.746000 audit[5050]: NETFILTER_CFG table=filter:141 family=2 entries=40 op=nft_register_chain pid=5050 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 18:00:51.746000 audit[5050]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=20784 a0=3 a1=ffffff5f88d0 a2=0 a3=ffff7f7d6fa8 items=0 ppid=4252 pid=5050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:51.746000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 18:00:51.766577 kubelet[2921]: E0116 18:00:51.766517 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:00:51.766577 kubelet[2921]: E0116 18:00:51.766572 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:00:51.797949 containerd[1658]: time="2026-01-16T18:00:51.797903902Z" level=info msg="connecting to shim 59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843" address="unix:///run/containerd/s/ab0654ebdced63478517028fad8eb7806b7d86d7204d801d689575dd091f1ae5" namespace=k8s.io protocol=ttrpc version=3 Jan 16 18:00:51.826672 systemd[1]: Started cri-containerd-59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843.scope - libcontainer container 59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843. Jan 16 18:00:51.835000 audit: BPF prog-id=251 op=LOAD Jan 16 18:00:51.836000 audit: BPF prog-id=252 op=LOAD Jan 16 18:00:51.836000 audit[5070]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=5060 pid=5070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:51.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539616364303331646165373765333966636234323334636539666335 Jan 16 18:00:51.836000 audit: BPF prog-id=252 op=UNLOAD Jan 16 18:00:51.836000 audit[5070]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5060 pid=5070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:51.836000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539616364303331646165373765333966636234323334636539666335 Jan 16 18:00:51.836000 audit: BPF prog-id=253 op=LOAD Jan 16 18:00:51.836000 audit[5070]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=5060 pid=5070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:51.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539616364303331646165373765333966636234323334636539666335 Jan 16 18:00:51.836000 audit: BPF prog-id=254 op=LOAD Jan 16 18:00:51.836000 audit[5070]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=5060 pid=5070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:51.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539616364303331646165373765333966636234323334636539666335 Jan 16 18:00:51.836000 audit: BPF prog-id=254 op=UNLOAD Jan 16 18:00:51.836000 audit[5070]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5060 pid=5070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 16 18:00:51.836000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539616364303331646165373765333966636234323334636539666335 Jan 16 18:00:51.837000 audit: BPF prog-id=253 op=UNLOAD Jan 16 18:00:51.837000 audit[5070]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5060 pid=5070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:51.837000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539616364303331646165373765333966636234323334636539666335 Jan 16 18:00:51.837000 audit: BPF prog-id=255 op=LOAD Jan 16 18:00:51.837000 audit[5070]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=5060 pid=5070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:51.837000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539616364303331646165373765333966636234323334636539666335 Jan 16 18:00:51.852129 containerd[1658]: time="2026-01-16T18:00:51.852040466Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rb578,Uid:08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"59acd031dae77e39fcb4234ce9fc592fb6201c411367ef6a63f837db84630843\"" Jan 16 18:00:51.853693 containerd[1658]: time="2026-01-16T18:00:51.853605710Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 16 18:00:52.186740 containerd[1658]: time="2026-01-16T18:00:52.186644519Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:52.188190 containerd[1658]: time="2026-01-16T18:00:52.188154964Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 16 18:00:52.188256 containerd[1658]: time="2026-01-16T18:00:52.188196764Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:52.188451 kubelet[2921]: E0116 18:00:52.188366 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 18:00:52.188515 kubelet[2921]: E0116 18:00:52.188464 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 18:00:52.188655 kubelet[2921]: E0116 18:00:52.188603 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-csxnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rb578_calico-system(08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 16 18:00:52.190649 containerd[1658]: time="2026-01-16T18:00:52.190601731Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 16 18:00:52.531066 systemd-networkd[1574]: cali69f36e807cc: Gained IPv6LL Jan 16 18:00:52.769559 kubelet[2921]: E0116 18:00:52.769259 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:00:52.816272 containerd[1658]: time="2026-01-16T18:00:52.816163547Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:52.818914 containerd[1658]: time="2026-01-16T18:00:52.818869955Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 16 18:00:52.818983 containerd[1658]: time="2026-01-16T18:00:52.818928315Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:52.819143 kubelet[2921]: E0116 18:00:52.819106 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 18:00:52.819225 kubelet[2921]: E0116 18:00:52.819155 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 18:00:52.819352 kubelet[2921]: E0116 18:00:52.819273 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-csxnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,Run
AsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rb578_calico-system(08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:52.820501 kubelet[2921]: E0116 18:00:52.820466 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:00:53.041620 systemd-networkd[1574]: calib475d680b24: Gained IPv6LL Jan 16 18:00:53.770496 kubelet[2921]: E0116 18:00:53.770446 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:00:55.604405 containerd[1658]: time="2026-01-16T18:00:55.604225554Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 16 18:00:55.940060 containerd[1658]: time="2026-01-16T18:00:55.939945971Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:55.946966 containerd[1658]: time="2026-01-16T18:00:55.946881592Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 16 18:00:55.947070 containerd[1658]: time="2026-01-16T18:00:55.946940432Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:55.947314 kubelet[2921]: E0116 18:00:55.947243 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 18:00:55.947314 kubelet[2921]: E0116 18:00:55.947292 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 18:00:55.947803 kubelet[2921]: E0116 18:00:55.947458 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0b1f42d8785e424d85c2c2149fa8b5bd,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6vqqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-55c9fc4598-2nzbk_calico-system(19ae2064-76f2-47ed-9968-9de33cfc7702): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:55.949509 containerd[1658]: time="2026-01-16T18:00:55.949480200Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 16 18:00:56.265841 containerd[1658]: time="2026-01-16T18:00:56.265611198Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:56.267479 containerd[1658]: time="2026-01-16T18:00:56.267415203Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 16 18:00:56.267537 containerd[1658]: time="2026-01-16T18:00:56.267512004Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:56.267797 kubelet[2921]: E0116 18:00:56.267705 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 18:00:56.267797 kubelet[2921]: E0116 18:00:56.267758 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 18:00:56.268257 kubelet[2921]: E0116 18:00:56.267910 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6vqqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-55c9fc4598-2nzbk_calico-system(19ae2064-76f2-47ed-9968-9de33cfc7702): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:56.269413 kubelet[2921]: E0116 18:00:56.269380 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:01:02.602500 containerd[1658]: time="2026-01-16T18:01:02.602458197Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 18:01:02.950871 containerd[1658]: time="2026-01-16T18:01:02.950819253Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:01:02.953635 containerd[1658]: time="2026-01-16T18:01:02.953540461Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 18:01:02.953635 containerd[1658]: time="2026-01-16T18:01:02.953585701Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 18:01:02.953766 kubelet[2921]: E0116 18:01:02.953718 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed 
to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:01:02.953766 kubelet[2921]: E0116 18:01:02.953761 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:01:02.954032 kubelet[2921]: E0116 18:01:02.953875 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdp42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5cdd49cc55-ck2gc_calico-apiserver(2d37882f-c058-4d23-87c0-40e1fbaf0de7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 18:01:02.955074 kubelet[2921]: E0116 18:01:02.955039 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:01:03.603163 containerd[1658]: time="2026-01-16T18:01:03.602866748Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 18:01:03.940149 containerd[1658]: time="2026-01-16T18:01:03.940077130Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 
18:01:03.942495 containerd[1658]: time="2026-01-16T18:01:03.942456017Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 18:01:03.942646 containerd[1658]: time="2026-01-16T18:01:03.942540937Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 18:01:03.942697 kubelet[2921]: E0116 18:01:03.942654 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:01:03.942738 kubelet[2921]: E0116 18:01:03.942699 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:01:03.943560 kubelet[2921]: E0116 18:01:03.942914 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzkc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5cdd49cc55-ql8tz_calico-apiserver(3860e0e2-bc4a-40cc-b58d-f0b04ee81f50): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 18:01:03.943700 containerd[1658]: time="2026-01-16T18:01:03.943342060Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 16 18:01:03.944950 kubelet[2921]: E0116 18:01:03.944912 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:01:04.279511 containerd[1658]: time="2026-01-16T18:01:04.279361518Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:01:04.281607 containerd[1658]: time="2026-01-16T18:01:04.281552725Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 16 18:01:04.281665 containerd[1658]: time="2026-01-16T18:01:04.281591925Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 16 18:01:04.281965 kubelet[2921]: E0116 18:01:04.281822 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 18:01:04.281965 kubelet[2921]: E0116 18:01:04.281944 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 18:01:04.282677 kubelet[2921]: E0116 18:01:04.282604 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbbxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-j5dxx_calico-system(64da405a-12aa-44ec-8b4d-a44866f591ec): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 16 18:01:04.283923 kubelet[2921]: E0116 18:01:04.283811 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:01:05.604699 containerd[1658]: time="2026-01-16T18:01:05.604436693Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 16 18:01:05.935327 containerd[1658]: time="2026-01-16T18:01:05.935225295Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:01:05.937418 containerd[1658]: time="2026-01-16T18:01:05.937322581Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 16 18:01:05.937418 containerd[1658]: time="2026-01-16T18:01:05.937371781Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 16 18:01:05.937606 kubelet[2921]: E0116 18:01:05.937545 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 18:01:05.937606 kubelet[2921]: E0116 18:01:05.937602 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 18:01:05.938062 kubelet[2921]: E0116 18:01:05.937714 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84qv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b6469dfdd-hzhpv_calico-system(cbb94aae-850a-4651-be6c-0c622d131a34): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 16 18:01:05.939026 kubelet[2921]: E0116 18:01:05.938938 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:01:07.603631 containerd[1658]: time="2026-01-16T18:01:07.603479629Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 16 18:01:07.948890 containerd[1658]: time="2026-01-16T18:01:07.948849116Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 
18:01:07.950511 containerd[1658]: time="2026-01-16T18:01:07.950479401Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 16 18:01:07.950644 containerd[1658]: time="2026-01-16T18:01:07.950555401Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 16 18:01:07.950809 kubelet[2921]: E0116 18:01:07.950774 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 18:01:07.951302 kubelet[2921]: E0116 18:01:07.951118 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 18:01:07.951302 kubelet[2921]: E0116 18:01:07.951244 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-csxnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rb578_calico-system(08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 16 18:01:07.953084 containerd[1658]: time="2026-01-16T18:01:07.953058009Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 16 18:01:08.307872 containerd[1658]: time="2026-01-16T18:01:08.307650763Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:01:08.309935 containerd[1658]: time="2026-01-16T18:01:08.309882090Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 16 18:01:08.310026 containerd[1658]: time="2026-01-16T18:01:08.309967210Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 16 18:01:08.310204 kubelet[2921]: E0116 18:01:08.310170 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 18:01:08.310300 kubelet[2921]: E0116 18:01:08.310280 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 18:01:08.310533 kubelet[2921]: E0116 18:01:08.310494 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-csxnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rb578_calico-system(08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 16 18:01:08.311959 kubelet[2921]: E0116 18:01:08.311900 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:01:08.602087 kubelet[2921]: E0116 18:01:08.601965 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:01:15.602338 kubelet[2921]: E0116 18:01:15.602115 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:01:15.603372 kubelet[2921]: E0116 18:01:15.602951 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:01:16.601929 kubelet[2921]: E0116 18:01:16.601814 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:01:18.602522 kubelet[2921]: E0116 18:01:18.602263 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:01:19.604957 kubelet[2921]: E0116 18:01:19.604887 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:01:21.602718 containerd[1658]: time="2026-01-16T18:01:21.602599004Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 16 18:01:21.937201 containerd[1658]: time="2026-01-16T18:01:21.936595176Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:01:21.937998 containerd[1658]: time="2026-01-16T18:01:21.937927980Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 16 18:01:21.937998 containerd[1658]: time="2026-01-16T18:01:21.937964740Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 16 18:01:21.938739 kubelet[2921]: E0116 
18:01:21.938638 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 18:01:21.938739 kubelet[2921]: E0116 18:01:21.938689 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 18:01:21.939443 kubelet[2921]: E0116 18:01:21.938830 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0b1f42d8785e424d85c2c2149fa8b5bd,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6vqqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,Localh
ostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-55c9fc4598-2nzbk_calico-system(19ae2064-76f2-47ed-9968-9de33cfc7702): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 16 18:01:21.941440 containerd[1658]: time="2026-01-16T18:01:21.941394551Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 16 18:01:22.312925 containerd[1658]: time="2026-01-16T18:01:22.312190434Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:01:22.313835 containerd[1658]: time="2026-01-16T18:01:22.313760119Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 16 18:01:22.313958 containerd[1658]: time="2026-01-16T18:01:22.313786079Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 16 18:01:22.314169 kubelet[2921]: E0116 18:01:22.314073 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 18:01:22.314219 kubelet[2921]: E0116 18:01:22.314178 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 18:01:22.314483 kubelet[2921]: E0116 18:01:22.314439 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6vqqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessag
ePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-55c9fc4598-2nzbk_calico-system(19ae2064-76f2-47ed-9968-9de33cfc7702): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 16 18:01:22.315946 kubelet[2921]: E0116 18:01:22.315896 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:01:27.602697 containerd[1658]: time="2026-01-16T18:01:27.602647183Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 16 18:01:27.957508 containerd[1658]: time="2026-01-16T18:01:27.957201337Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:01:27.959071 containerd[1658]: time="2026-01-16T18:01:27.959030023Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 16 18:01:27.959136 containerd[1658]: time="2026-01-16T18:01:27.959086743Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 16 18:01:27.959291 kubelet[2921]: E0116 18:01:27.959253 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 18:01:27.959610 kubelet[2921]: E0116 18:01:27.959304 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 18:01:27.959610 kubelet[2921]: E0116 18:01:27.959441 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:
,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbbxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-j5dxx_calico-system(64da405a-12aa-44ec-8b4d-a44866f591ec): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 16 18:01:27.960692 kubelet[2921]: E0116 18:01:27.960654 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:01:29.603567 containerd[1658]: time="2026-01-16T18:01:29.603401525Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 18:01:29.963765 containerd[1658]: time="2026-01-16T18:01:29.963663576Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:01:29.965739 containerd[1658]: time="2026-01-16T18:01:29.965634942Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 18:01:29.965739 containerd[1658]: time="2026-01-16T18:01:29.965670823Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 18:01:29.967682 kubelet[2921]: E0116 18:01:29.967544 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:01:29.967682 kubelet[2921]: E0116 18:01:29.967610 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:01:29.968307 kubelet[2921]: E0116 18:01:29.968176 2921 kuberuntime_manager.go:1341] "Unhandled Error" 
err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdp42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5cdd49cc55-ck2gc_calico-apiserver(2d37882f-c058-4d23-87c0-40e1fbaf0de7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 18:01:29.968402 containerd[1658]: time="2026-01-16T18:01:29.967959949Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 16 18:01:29.969386 kubelet[2921]: E0116 18:01:29.969339 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:01:30.320813 containerd[1658]: time="2026-01-16T18:01:30.320567698Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 
18:01:30.321902 containerd[1658]: time="2026-01-16T18:01:30.321863182Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 16 18:01:30.322675 containerd[1658]: time="2026-01-16T18:01:30.321893902Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 16 18:01:30.322901 kubelet[2921]: E0116 18:01:30.322862 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 18:01:30.322954 kubelet[2921]: E0116 18:01:30.322913 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 18:01:30.323074 kubelet[2921]: E0116 18:01:30.323024 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84qv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b6469dfdd-hzhpv_calico-system(cbb94aae-850a-4651-be6c-0c622d131a34): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 16 18:01:30.324314 kubelet[2921]: E0116 18:01:30.324276 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:01:30.602508 containerd[1658]: time="2026-01-16T18:01:30.602134231Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 18:01:30.940018 containerd[1658]: time="2026-01-16T18:01:30.939920694Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 
18:01:30.941253 containerd[1658]: time="2026-01-16T18:01:30.941188298Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 18:01:30.941354 containerd[1658]: time="2026-01-16T18:01:30.941304938Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 18:01:30.941565 kubelet[2921]: E0116 18:01:30.941521 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:01:30.941618 kubelet[2921]: E0116 18:01:30.941578 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:01:30.941828 kubelet[2921]: E0116 18:01:30.941744 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzkc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5cdd49cc55-ql8tz_calico-apiserver(3860e0e2-bc4a-40cc-b58d-f0b04ee81f50): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 18:01:30.942975 kubelet[2921]: E0116 18:01:30.942934 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:01:33.605329 containerd[1658]: time="2026-01-16T18:01:33.604608688Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 16 18:01:33.929064 containerd[1658]: time="2026-01-16T18:01:33.928850070Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:01:33.930385 containerd[1658]: time="2026-01-16T18:01:33.930344115Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 16 18:01:33.930501 containerd[1658]: time="2026-01-16T18:01:33.930365075Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 16 18:01:33.930602 kubelet[2921]: E0116 18:01:33.930561 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 18:01:33.930965 kubelet[2921]: E0116 18:01:33.930617 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 18:01:33.931578 kubelet[2921]: E0116 18:01:33.931498 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-csxnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fil
e,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rb578_calico-system(08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 16 18:01:33.933517 containerd[1658]: time="2026-01-16T18:01:33.933490564Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 16 18:01:34.298842 containerd[1658]: time="2026-01-16T18:01:34.298212309Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:01:34.300201 containerd[1658]: time="2026-01-16T18:01:34.300135435Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 16 18:01:34.300271 containerd[1658]: time="2026-01-16T18:01:34.300222435Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 16 18:01:34.300433 kubelet[2921]: E0116 18:01:34.300359 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 18:01:34.300489 kubelet[2921]: E0116 18:01:34.300451 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 18:01:34.300685 kubelet[2921]: E0116 18:01:34.300587 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-csxnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Vol
umeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rb578_calico-system(08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 16 18:01:34.301976 kubelet[2921]: E0116 18:01:34.301927 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:01:35.603416 kubelet[2921]: E0116 18:01:35.603335 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve 
image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:01:40.602490 kubelet[2921]: E0116 18:01:40.602439 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:01:41.601987 kubelet[2921]: E0116 18:01:41.601928 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:01:44.603104 kubelet[2921]: E0116 18:01:44.602954 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:01:45.603084 kubelet[2921]: E0116 18:01:45.602824 2921 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:01:46.604458 kubelet[2921]: E0116 18:01:46.604089 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:01:47.602790 kubelet[2921]: E0116 18:01:47.602744 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:01:52.604297 kubelet[2921]: E0116 18:01:52.604254 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:01:56.602452 kubelet[2921]: E0116 18:01:56.602356 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:01:56.603251 kubelet[2921]: E0116 18:01:56.602788 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:01:58.603325 kubelet[2921]: E0116 18:01:58.602295 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:01:59.602787 kubelet[2921]: E0116 18:01:59.602719 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:02:00.604096 kubelet[2921]: E0116 18:02:00.604029 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc 
error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:02:06.602533 kubelet[2921]: E0116 18:02:06.602282 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:02:09.602874 kubelet[2921]: E0116 18:02:09.602701 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:02:10.601741 kubelet[2921]: E0116 18:02:10.601688 2921 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:02:10.603071 containerd[1658]: time="2026-01-16T18:02:10.603035266Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 16 18:02:10.953601 containerd[1658]: time="2026-01-16T18:02:10.953544728Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:02:10.956483 containerd[1658]: time="2026-01-16T18:02:10.956437177Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 16 18:02:10.956606 containerd[1658]: time="2026-01-16T18:02:10.956520137Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 16 18:02:10.956937 kubelet[2921]: E0116 18:02:10.956768 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 18:02:10.956937 kubelet[2921]: E0116 18:02:10.956838 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 18:02:10.958053 kubelet[2921]: E0116 18:02:10.957755 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbbxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-j5dxx_calico-system(64da405a-12aa-44ec-8b4d-a44866f591ec): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 16 18:02:10.959359 kubelet[2921]: E0116 18:02:10.959328 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:02:11.603172 kubelet[2921]: E0116 18:02:11.603033 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:02:12.603112 containerd[1658]: time="2026-01-16T18:02:12.602648164Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 16 18:02:12.971136 containerd[1658]: time="2026-01-16T18:02:12.970872920Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:02:12.973438 containerd[1658]: time="2026-01-16T18:02:12.973370088Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 16 18:02:12.973513 containerd[1658]: time="2026-01-16T18:02:12.973455288Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 16 18:02:12.973713 kubelet[2921]: E0116 18:02:12.973652 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 18:02:12.974015 kubelet[2921]: E0116 18:02:12.973775 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 18:02:12.974106 kubelet[2921]: E0116 18:02:12.974001 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0b1f42d8785e424d85c2c2149fa8b5bd,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6vqqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-55c9fc4598-2nzbk_calico-system(19ae2064-76f2-47ed-9968-9de33cfc7702): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 16 18:02:12.976148 containerd[1658]: time="2026-01-16T18:02:12.976069256Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 16 18:02:13.319001 containerd[1658]: time="2026-01-16T18:02:13.318773974Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:02:13.320574 containerd[1658]: time="2026-01-16T18:02:13.320472059Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 16 18:02:13.320574 containerd[1658]: time="2026-01-16T18:02:13.320496299Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 16 18:02:13.320882 kubelet[2921]: E0116 18:02:13.320801 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 18:02:13.320930 kubelet[2921]: E0116 18:02:13.320887 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 18:02:13.321029 kubelet[2921]: E0116 18:02:13.320991 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6vqqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-55c9fc4598-2nzbk_calico-system(19ae2064-76f2-47ed-9968-9de33cfc7702): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 16 18:02:13.322536 kubelet[2921]: E0116 18:02:13.322496 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:02:20.460459 kernel: kauditd_printk_skb: 233 callbacks suppressed Jan 16 18:02:20.460599 kernel: audit: type=1130 audit(1768586540.458:749): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.7.62:22-4.153.228.146:59192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:20.458000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.7.62:22-4.153.228.146:59192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:20.458793 systemd[1]: Started sshd@9-10.0.7.62:22-4.153.228.146:59192.service - OpenSSH per-connection server daemon (4.153.228.146:59192). 
Jan 16 18:02:20.602935 containerd[1658]: time="2026-01-16T18:02:20.602274282Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 18:02:20.958708 containerd[1658]: time="2026-01-16T18:02:20.958664282Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:02:20.961318 containerd[1658]: time="2026-01-16T18:02:20.961251249Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 18:02:20.961397 containerd[1658]: time="2026-01-16T18:02:20.961317570Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 18:02:20.961523 kubelet[2921]: E0116 18:02:20.961483 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:02:20.961993 kubelet[2921]: E0116 18:02:20.961533 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:02:20.961993 kubelet[2921]: E0116 18:02:20.961654 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdp42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5cdd49cc55-ck2gc_calico-apiserver(2d37882f-c058-4d23-87c0-40e1fbaf0de7): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 18:02:20.962983 kubelet[2921]: E0116 18:02:20.962943 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:02:21.004000 audit[5246]: USER_ACCT pid=5246 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:21.005906 sshd[5246]: Accepted publickey for core from 4.153.228.146 port 59192 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 18:02:21.007000 audit[5246]: CRED_ACQ pid=5246 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:21.009451 sshd-session[5246]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:02:21.012541 kernel: audit: type=1101 audit(1768586541.004:750): pid=5246 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:21.012603 kernel: audit: type=1103 audit(1768586541.007:751): pid=5246 uid=0 
auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:21.012639 kernel: audit: type=1006 audit(1768586541.007:752): pid=5246 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 16 18:02:21.014330 systemd-logind[1644]: New session 11 of user core. Jan 16 18:02:21.014568 kernel: audit: type=1300 audit(1768586541.007:752): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd31c6440 a2=3 a3=0 items=0 ppid=1 pid=5246 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:21.007000 audit[5246]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd31c6440 a2=3 a3=0 items=0 ppid=1 pid=5246 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:21.007000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:02:21.019326 kernel: audit: type=1327 audit(1768586541.007:752): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:02:21.024602 systemd[1]: Started session-11.scope - Session 11 of User core. 
Jan 16 18:02:21.026000 audit[5246]: USER_START pid=5246 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:21.031000 audit[5250]: CRED_ACQ pid=5250 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:21.036397 kernel: audit: type=1105 audit(1768586541.026:753): pid=5246 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:21.036465 kernel: audit: type=1103 audit(1768586541.031:754): pid=5250 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:21.380389 sshd[5250]: Connection closed by 4.153.228.146 port 59192 Jan 16 18:02:21.381593 sshd-session[5246]: pam_unix(sshd:session): session closed for user core Jan 16 18:02:21.381000 audit[5246]: USER_END pid=5246 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:21.385848 systemd[1]: 
sshd@9-10.0.7.62:22-4.153.228.146:59192.service: Deactivated successfully. Jan 16 18:02:21.381000 audit[5246]: CRED_DISP pid=5246 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:21.387986 systemd[1]: session-11.scope: Deactivated successfully. Jan 16 18:02:21.390515 kernel: audit: type=1106 audit(1768586541.381:755): pid=5246 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:21.390578 kernel: audit: type=1104 audit(1768586541.381:756): pid=5246 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:21.384000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.7.62:22-4.153.228.146:59192 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:21.390579 systemd-logind[1644]: Session 11 logged out. Waiting for processes to exit. Jan 16 18:02:21.391804 systemd-logind[1644]: Removed session 11. 
Jan 16 18:02:24.603729 kubelet[2921]: E0116 18:02:24.603659 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:02:24.604562 containerd[1658]: time="2026-01-16T18:02:24.604482648Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 16 18:02:24.951444 containerd[1658]: time="2026-01-16T18:02:24.951352099Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:02:24.952955 containerd[1658]: time="2026-01-16T18:02:24.952876103Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 16 18:02:24.953035 containerd[1658]: time="2026-01-16T18:02:24.952972584Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 16 18:02:24.953398 kubelet[2921]: E0116 18:02:24.953077 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 18:02:24.953398 kubelet[2921]: E0116 18:02:24.953127 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 18:02:24.953594 containerd[1658]: time="2026-01-16T18:02:24.953552345Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 18:02:24.953767 kubelet[2921]: E0116 18:02:24.953331 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84qv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b6469dfdd-hzhpv_calico-system(cbb94aae-850a-4651-be6c-0c622d131a34): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 16 18:02:24.956448 kubelet[2921]: E0116 18:02:24.954879 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:02:25.291757 
containerd[1658]: time="2026-01-16T18:02:25.290973608Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:02:25.296572 containerd[1658]: time="2026-01-16T18:02:25.296435344Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 18:02:25.296572 containerd[1658]: time="2026-01-16T18:02:25.296445864Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 18:02:25.296767 kubelet[2921]: E0116 18:02:25.296717 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:02:25.296810 kubelet[2921]: E0116 18:02:25.296770 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:02:25.297266 kubelet[2921]: E0116 18:02:25.296887 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzkc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5cdd49cc55-ql8tz_calico-apiserver(3860e0e2-bc4a-40cc-b58d-f0b04ee81f50): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 18:02:25.298466 kubelet[2921]: E0116 18:02:25.298372 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:02:25.603355 kubelet[2921]: E0116 18:02:25.603218 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:02:26.492009 systemd[1]: Started sshd@10-10.0.7.62:22-4.153.228.146:42162.service - OpenSSH per-connection server daemon (4.153.228.146:42162). 
Jan 16 18:02:26.490000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.7.62:22-4.153.228.146:42162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:26.493654 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 18:02:26.493712 kernel: audit: type=1130 audit(1768586546.490:758): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.7.62:22-4.153.228.146:42162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:26.603076 containerd[1658]: time="2026-01-16T18:02:26.603038223Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 16 18:02:26.929206 containerd[1658]: time="2026-01-16T18:02:26.929156611Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:02:26.930667 containerd[1658]: time="2026-01-16T18:02:26.930234414Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 16 18:02:26.930667 containerd[1658]: time="2026-01-16T18:02:26.930278574Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 16 18:02:26.930782 kubelet[2921]: E0116 18:02:26.930581 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 18:02:26.930782 kubelet[2921]: E0116 18:02:26.930621 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Jan 16 18:02:26.930782 kubelet[2921]: E0116 18:02:26.930719 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-csxnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rb578_calico-system(08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError"
Jan 16 18:02:26.933033 containerd[1658]: time="2026-01-16T18:02:26.932998343Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\""
Jan 16 18:02:27.053738 sshd[5266]: Accepted publickey for core from 4.153.228.146 port 42162 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM
Jan 16 18:02:27.052000 audit[5266]: USER_ACCT pid=5266 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:27.056787 sshd-session[5266]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 16 18:02:27.053000 audit[5266]: CRED_ACQ pid=5266 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:27.062376 kernel: audit: type=1101 audit(1768586547.052:759): pid=5266 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:27.062469 kernel: audit: type=1103 audit(1768586547.053:760): pid=5266 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:27.064463 kernel: audit: type=1006 audit(1768586547.053:761): pid=5266 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1
Jan 16 18:02:27.053000 audit[5266]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd2a52ad0 a2=3 a3=0 items=0 ppid=1 pid=5266 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:02:27.068308 kernel: audit: type=1300 audit(1768586547.053:761): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd2a52ad0 a2=3 a3=0 items=0 ppid=1 pid=5266 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:02:27.068287 systemd-logind[1644]: New session 12 of user core.
Jan 16 18:02:27.053000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 16 18:02:27.069974 kernel: audit: type=1327 audit(1768586547.053:761): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 16 18:02:27.072657 systemd[1]: Started session-12.scope - Session 12 of User core.
Jan 16 18:02:27.075000 audit[5266]: USER_START pid=5266 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:27.081442 kernel: audit: type=1105 audit(1768586547.075:762): pid=5266 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:27.082534 kernel: audit: type=1103 audit(1768586547.080:763): pid=5270 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:27.080000 audit[5270]: CRED_ACQ pid=5270 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:27.268924 containerd[1658]: time="2026-01-16T18:02:27.268438439Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 16 18:02:27.269937 containerd[1658]: time="2026-01-16T18:02:27.269823283Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found"
Jan 16 18:02:27.269937 containerd[1658]: time="2026-01-16T18:02:27.269887763Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0"
Jan 16 18:02:27.270256 kubelet[2921]: E0116 18:02:27.270052 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Jan 16 18:02:27.270256 kubelet[2921]: E0116 18:02:27.270096 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Jan 16 18:02:27.270256 kubelet[2921]: E0116 18:02:27.270226 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-csxnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rb578_calico-system(08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError"
Jan 16 18:02:27.271606 kubelet[2921]: E0116 18:02:27.271560 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3"
Jan 16 18:02:27.417628 sshd[5270]: Connection closed by 4.153.228.146 port 42162
Jan 16 18:02:27.418500 sshd-session[5266]: pam_unix(sshd:session): session closed for user core
Jan 16 18:02:27.419000 audit[5266]: USER_END pid=5266 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:27.425885 systemd[1]: sshd@10-10.0.7.62:22-4.153.228.146:42162.service: Deactivated successfully.
Jan 16 18:02:27.419000 audit[5266]: CRED_DISP pid=5266 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:27.427816 systemd[1]: session-12.scope: Deactivated successfully.
Jan 16 18:02:27.429506 kernel: audit: type=1106 audit(1768586547.419:764): pid=5266 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:27.429578 kernel: audit: type=1104 audit(1768586547.419:765): pid=5266 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:27.429624 systemd-logind[1644]: Session 12 logged out. Waiting for processes to exit.
Jan 16 18:02:27.424000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.7.62:22-4.153.228.146:42162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 18:02:27.430714 systemd-logind[1644]: Removed session 12.
Jan 16 18:02:32.535000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.7.62:22-4.153.228.146:42166 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 18:02:32.536444 systemd[1]: Started sshd@11-10.0.7.62:22-4.153.228.146:42166.service - OpenSSH per-connection server daemon (4.153.228.146:42166).
Jan 16 18:02:32.540276 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 16 18:02:32.540348 kernel: audit: type=1130 audit(1768586552.535:767): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.7.62:22-4.153.228.146:42166 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 18:02:33.072000 audit[5291]: USER_ACCT pid=5291 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:33.073890 sshd[5291]: Accepted publickey for core from 4.153.228.146 port 42166 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM
Jan 16 18:02:33.075000 audit[5291]: CRED_ACQ pid=5291 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:33.077328 sshd-session[5291]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 16 18:02:33.079565 kernel: audit: type=1101 audit(1768586553.072:768): pid=5291 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:33.079628 kernel: audit: type=1103 audit(1768586553.075:769): pid=5291 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:33.081534 kernel: audit: type=1006 audit(1768586553.075:770): pid=5291 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1
Jan 16 18:02:33.075000 audit[5291]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc38ce3b0 a2=3 a3=0 items=0 ppid=1 pid=5291 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:02:33.085073 kernel: audit: type=1300 audit(1768586553.075:770): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc38ce3b0 a2=3 a3=0 items=0 ppid=1 pid=5291 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:02:33.075000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 16 18:02:33.086391 kernel: audit: type=1327 audit(1768586553.075:770): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 16 18:02:33.087891 systemd-logind[1644]: New session 13 of user core.
Jan 16 18:02:33.098641 systemd[1]: Started session-13.scope - Session 13 of User core.
Jan 16 18:02:33.100000 audit[5291]: USER_START pid=5291 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:33.102000 audit[5295]: CRED_ACQ pid=5295 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:33.108410 kernel: audit: type=1105 audit(1768586553.100:771): pid=5291 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:33.108489 kernel: audit: type=1103 audit(1768586553.102:772): pid=5295 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:33.453831 sshd[5295]: Connection closed by 4.153.228.146 port 42166
Jan 16 18:02:33.453511 sshd-session[5291]: pam_unix(sshd:session): session closed for user core
Jan 16 18:02:33.453000 audit[5291]: USER_END pid=5291 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:33.457437 systemd[1]: sshd@11-10.0.7.62:22-4.153.228.146:42166.service: Deactivated successfully.
Jan 16 18:02:33.459669 systemd[1]: session-13.scope: Deactivated successfully.
Jan 16 18:02:33.453000 audit[5291]: CRED_DISP pid=5291 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:33.463788 kernel: audit: type=1106 audit(1768586553.453:773): pid=5291 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:33.463851 kernel: audit: type=1104 audit(1768586553.453:774): pid=5291 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:33.463844 systemd-logind[1644]: Session 13 logged out. Waiting for processes to exit.
Jan 16 18:02:33.453000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.7.62:22-4.153.228.146:42166 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 18:02:33.465208 systemd-logind[1644]: Removed session 13.
Jan 16 18:02:33.570731 systemd[1]: Started sshd@12-10.0.7.62:22-4.153.228.146:42172.service - OpenSSH per-connection server daemon (4.153.228.146:42172).
Jan 16 18:02:33.569000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.7.62:22-4.153.228.146:42172 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 18:02:34.131000 audit[5310]: USER_ACCT pid=5310 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:34.132651 sshd[5310]: Accepted publickey for core from 4.153.228.146 port 42172 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM
Jan 16 18:02:34.132000 audit[5310]: CRED_ACQ pid=5310 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:34.132000 audit[5310]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd15cb830 a2=3 a3=0 items=0 ppid=1 pid=5310 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:02:34.132000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 16 18:02:34.134304 sshd-session[5310]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 16 18:02:34.138470 systemd-logind[1644]: New session 14 of user core.
Jan 16 18:02:34.148840 systemd[1]: Started session-14.scope - Session 14 of User core.
Jan 16 18:02:34.149000 audit[5310]: USER_START pid=5310 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:34.150000 audit[5314]: CRED_ACQ pid=5314 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:34.530191 sshd[5314]: Connection closed by 4.153.228.146 port 42172
Jan 16 18:02:34.530591 sshd-session[5310]: pam_unix(sshd:session): session closed for user core
Jan 16 18:02:34.530000 audit[5310]: USER_END pid=5310 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:34.530000 audit[5310]: CRED_DISP pid=5310 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:34.534984 systemd[1]: sshd@12-10.0.7.62:22-4.153.228.146:42172.service: Deactivated successfully.
Jan 16 18:02:34.533000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.7.62:22-4.153.228.146:42172 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 18:02:34.536878 systemd[1]: session-14.scope: Deactivated successfully.
Jan 16 18:02:34.537757 systemd-logind[1644]: Session 14 logged out. Waiting for processes to exit.
Jan 16 18:02:34.538708 systemd-logind[1644]: Removed session 14.
Jan 16 18:02:34.642000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.7.62:22-4.153.228.146:58730 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 18:02:34.643860 systemd[1]: Started sshd@13-10.0.7.62:22-4.153.228.146:58730.service - OpenSSH per-connection server daemon (4.153.228.146:58730).
Jan 16 18:02:35.186000 audit[5325]: USER_ACCT pid=5325 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:35.188399 sshd[5325]: Accepted publickey for core from 4.153.228.146 port 58730 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM
Jan 16 18:02:35.187000 audit[5325]: CRED_ACQ pid=5325 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:35.187000 audit[5325]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffffc1b020 a2=3 a3=0 items=0 ppid=1 pid=5325 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:02:35.187000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 16 18:02:35.190071 sshd-session[5325]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 16 18:02:35.193831 systemd-logind[1644]: New session 15 of user core.
Jan 16 18:02:35.203585 systemd[1]: Started session-15.scope - Session 15 of User core.
Jan 16 18:02:35.204000 audit[5325]: USER_START pid=5325 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:35.206000 audit[5329]: CRED_ACQ pid=5329 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:35.551286 sshd[5329]: Connection closed by 4.153.228.146 port 58730
Jan 16 18:02:35.551391 sshd-session[5325]: pam_unix(sshd:session): session closed for user core
Jan 16 18:02:35.552000 audit[5325]: USER_END pid=5325 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:35.552000 audit[5325]: CRED_DISP pid=5325 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:35.556609 systemd[1]: sshd@13-10.0.7.62:22-4.153.228.146:58730.service: Deactivated successfully.
Jan 16 18:02:35.556000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.7.62:22-4.153.228.146:58730 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 18:02:35.559194 systemd[1]: session-15.scope: Deactivated successfully.
Jan 16 18:02:35.559971 systemd-logind[1644]: Session 15 logged out. Waiting for processes to exit.
Jan 16 18:02:35.561034 systemd-logind[1644]: Removed session 15.
Jan 16 18:02:35.603544 kubelet[2921]: E0116 18:02:35.603222 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50"
Jan 16 18:02:35.605437 kubelet[2921]: E0116 18:02:35.604796 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec"
Jan 16 18:02:36.602949 kubelet[2921]: E0116 18:02:36.602804 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7"
Jan 16 18:02:36.603693 kubelet[2921]: E0116 18:02:36.603256 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702"
Jan 16 18:02:38.602068 kubelet[2921]: E0116 18:02:38.602022 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34"
Jan 16 18:02:39.605250 kubelet[2921]: E0116 18:02:39.605167 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3"
Jan 16 18:02:40.661938 systemd[1]: Started sshd@14-10.0.7.62:22-4.153.228.146:58742.service - OpenSSH per-connection server daemon (4.153.228.146:58742).
Jan 16 18:02:40.660000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.7.62:22-4.153.228.146:58742 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 18:02:40.664482 kernel: kauditd_printk_skb: 23 callbacks suppressed
Jan 16 18:02:40.664592 kernel: audit: type=1130 audit(1768586560.660:794): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.7.62:22-4.153.228.146:58742 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 18:02:41.183445 sshd[5342]: Accepted publickey for core from 4.153.228.146 port 58742 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM
Jan 16 18:02:41.181000 audit[5342]: USER_ACCT pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:41.187000 audit[5342]: CRED_ACQ pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:41.190040 sshd-session[5342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 16 18:02:41.192241 kernel: audit: type=1101 audit(1768586561.181:795): pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:41.192733 kernel: audit: type=1103 audit(1768586561.187:796): pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:41.192767 kernel: audit: type=1006 audit(1768586561.187:797): pid=5342 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1
Jan 16 18:02:41.187000 audit[5342]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc6ae3790 a2=3 a3=0 items=0 ppid=1 pid=5342 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:02:41.197497 kernel: audit: type=1300 audit(1768586561.187:797): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc6ae3790 a2=3 a3=0 items=0 ppid=1 pid=5342 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:02:41.187000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 16 18:02:41.197860 systemd-logind[1644]: New session 16 of user core.
Jan 16 18:02:41.198778 kernel: audit: type=1327 audit(1768586561.187:797): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 16 18:02:41.204641 systemd[1]: Started session-16.scope - Session 16 of User core.
Jan 16 18:02:41.205000 audit[5342]: USER_START pid=5342 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:41.210000 audit[5346]: CRED_ACQ pid=5346 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:41.215343 kernel: audit: type=1105 audit(1768586561.205:798): pid=5342 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:41.215511 kernel: audit: type=1103 audit(1768586561.210:799): pid=5346 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:41.538610 sshd[5346]: Connection closed by 4.153.228.146 port 58742
Jan 16 18:02:41.539375 sshd-session[5342]: pam_unix(sshd:session): session closed for user core
Jan 16 18:02:41.539000 audit[5342]: USER_END pid=5342 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:41.539000 audit[5342]: CRED_DISP pid=5342 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 16 18:02:41.547467 systemd[1]: sshd@14-10.0.7.62:22-4.153.228.146:58742.service: Deactivated successfully.
Jan 16 18:02:41.547966 kernel: audit: type=1106 audit(1768586561.539:800): pid=5342 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:41.548030 kernel: audit: type=1104 audit(1768586561.539:801): pid=5342 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:41.546000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.7.62:22-4.153.228.146:58742 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:41.550210 systemd[1]: session-16.scope: Deactivated successfully. Jan 16 18:02:41.550987 systemd-logind[1644]: Session 16 logged out. Waiting for processes to exit. Jan 16 18:02:41.553875 systemd-logind[1644]: Removed session 16. Jan 16 18:02:41.652374 systemd[1]: Started sshd@15-10.0.7.62:22-4.153.228.146:58748.service - OpenSSH per-connection server daemon (4.153.228.146:58748). Jan 16 18:02:41.651000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.7.62:22-4.153.228.146:58748 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 18:02:42.198000 audit[5360]: USER_ACCT pid=5360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:42.199948 sshd[5360]: Accepted publickey for core from 4.153.228.146 port 58748 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 18:02:42.199000 audit[5360]: CRED_ACQ pid=5360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:42.199000 audit[5360]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe2955030 a2=3 a3=0 items=0 ppid=1 pid=5360 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:42.199000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:02:42.202000 sshd-session[5360]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:02:42.206902 systemd-logind[1644]: New session 17 of user core. Jan 16 18:02:42.215867 systemd[1]: Started session-17.scope - Session 17 of User core. 
Jan 16 18:02:42.217000 audit[5360]: USER_START pid=5360 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:42.219000 audit[5390]: CRED_ACQ pid=5390 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:42.649486 sshd[5390]: Connection closed by 4.153.228.146 port 58748 Jan 16 18:02:42.649464 sshd-session[5360]: pam_unix(sshd:session): session closed for user core Jan 16 18:02:42.649000 audit[5360]: USER_END pid=5360 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:42.649000 audit[5360]: CRED_DISP pid=5360 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:42.655416 systemd[1]: sshd@15-10.0.7.62:22-4.153.228.146:58748.service: Deactivated successfully. Jan 16 18:02:42.654000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.7.62:22-4.153.228.146:58748 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:42.657171 systemd[1]: session-17.scope: Deactivated successfully. 
Jan 16 18:02:42.657996 systemd-logind[1644]: Session 17 logged out. Waiting for processes to exit. Jan 16 18:02:42.659004 systemd-logind[1644]: Removed session 17. Jan 16 18:02:42.756127 systemd[1]: Started sshd@16-10.0.7.62:22-4.153.228.146:58756.service - OpenSSH per-connection server daemon (4.153.228.146:58756). Jan 16 18:02:42.754000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.7.62:22-4.153.228.146:58756 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:43.290000 audit[5401]: USER_ACCT pid=5401 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:43.292636 sshd[5401]: Accepted publickey for core from 4.153.228.146 port 58756 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 18:02:43.292000 audit[5401]: CRED_ACQ pid=5401 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:43.292000 audit[5401]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe3d44830 a2=3 a3=0 items=0 ppid=1 pid=5401 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:43.292000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:02:43.295388 sshd-session[5401]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:02:43.301961 systemd-logind[1644]: New session 18 of user core. 
Jan 16 18:02:43.309754 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 16 18:02:43.312000 audit[5401]: USER_START pid=5401 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:43.314000 audit[5405]: CRED_ACQ pid=5405 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:43.916000 audit[5419]: NETFILTER_CFG table=filter:142 family=2 entries=26 op=nft_register_rule pid=5419 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:02:43.916000 audit[5419]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffc7a296b0 a2=0 a3=1 items=0 ppid=3072 pid=5419 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:43.916000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:02:43.929000 audit[5419]: NETFILTER_CFG table=nat:143 family=2 entries=20 op=nft_register_rule pid=5419 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:02:43.929000 audit[5419]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffc7a296b0 a2=0 a3=1 items=0 ppid=3072 pid=5419 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:43.929000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:02:43.949000 audit[5421]: NETFILTER_CFG table=filter:144 family=2 entries=38 op=nft_register_rule pid=5421 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:02:43.949000 audit[5421]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=fffff3212620 a2=0 a3=1 items=0 ppid=3072 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:43.949000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:02:43.954000 audit[5421]: NETFILTER_CFG table=nat:145 family=2 entries=20 op=nft_register_rule pid=5421 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:02:43.954000 audit[5421]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=fffff3212620 a2=0 a3=1 items=0 ppid=3072 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:43.954000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:02:44.017268 sshd[5405]: Connection closed by 4.153.228.146 port 58756 Jan 16 18:02:44.017179 sshd-session[5401]: pam_unix(sshd:session): session closed for user core Jan 16 18:02:44.019000 audit[5401]: USER_END pid=5401 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 
addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:44.019000 audit[5401]: CRED_DISP pid=5401 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:44.026032 systemd-logind[1644]: Session 18 logged out. Waiting for processes to exit. Jan 16 18:02:44.025000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.7.62:22-4.153.228.146:58756 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:44.026189 systemd[1]: sshd@16-10.0.7.62:22-4.153.228.146:58756.service: Deactivated successfully. Jan 16 18:02:44.031015 systemd[1]: session-18.scope: Deactivated successfully. Jan 16 18:02:44.040415 systemd-logind[1644]: Removed session 18. Jan 16 18:02:44.127072 systemd[1]: Started sshd@17-10.0.7.62:22-4.153.228.146:58762.service - OpenSSH per-connection server daemon (4.153.228.146:58762). Jan 16 18:02:44.125000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.7.62:22-4.153.228.146:58762 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 18:02:44.668197 sshd[5426]: Accepted publickey for core from 4.153.228.146 port 58762 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 18:02:44.666000 audit[5426]: USER_ACCT pid=5426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:44.668000 audit[5426]: CRED_ACQ pid=5426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:44.668000 audit[5426]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffea25b440 a2=3 a3=0 items=0 ppid=1 pid=5426 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:44.668000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:02:44.670765 sshd-session[5426]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:02:44.676486 systemd-logind[1644]: New session 19 of user core. Jan 16 18:02:44.686003 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 16 18:02:44.687000 audit[5426]: USER_START pid=5426 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:44.688000 audit[5430]: CRED_ACQ pid=5430 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:45.122692 sshd[5430]: Connection closed by 4.153.228.146 port 58762 Jan 16 18:02:45.123179 sshd-session[5426]: pam_unix(sshd:session): session closed for user core Jan 16 18:02:45.123000 audit[5426]: USER_END pid=5426 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:45.123000 audit[5426]: CRED_DISP pid=5426 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:45.127503 systemd[1]: sshd@17-10.0.7.62:22-4.153.228.146:58762.service: Deactivated successfully. Jan 16 18:02:45.128000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.7.62:22-4.153.228.146:58762 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:45.131304 systemd[1]: session-19.scope: Deactivated successfully. 
Jan 16 18:02:45.133047 systemd-logind[1644]: Session 19 logged out. Waiting for processes to exit. Jan 16 18:02:45.134525 systemd-logind[1644]: Removed session 19. Jan 16 18:02:45.230248 systemd[1]: Started sshd@18-10.0.7.62:22-4.153.228.146:59996.service - OpenSSH per-connection server daemon (4.153.228.146:59996). Jan 16 18:02:45.229000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.7.62:22-4.153.228.146:59996 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:45.779000 audit[5442]: USER_ACCT pid=5442 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:45.780960 sshd[5442]: Accepted publickey for core from 4.153.228.146 port 59996 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 18:02:45.782054 kernel: kauditd_printk_skb: 47 callbacks suppressed Jan 16 18:02:45.787880 sshd-session[5442]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:02:45.786000 audit[5442]: CRED_ACQ pid=5442 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:45.791044 kernel: audit: type=1101 audit(1768586565.779:835): pid=5442 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:45.791145 kernel: audit: type=1103 audit(1768586565.786:836): pid=5442 uid=0 auid=4294967295 
ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:45.786000 audit[5442]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff90272f0 a2=3 a3=0 items=0 ppid=1 pid=5442 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:45.799936 kernel: audit: type=1006 audit(1768586565.786:837): pid=5442 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Jan 16 18:02:45.800023 kernel: audit: type=1300 audit(1768586565.786:837): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff90272f0 a2=3 a3=0 items=0 ppid=1 pid=5442 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:45.786000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:02:45.804496 kernel: audit: type=1327 audit(1768586565.786:837): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:02:45.804906 systemd-logind[1644]: New session 20 of user core. Jan 16 18:02:45.810770 systemd[1]: Started session-20.scope - Session 20 of User core. 
Jan 16 18:02:45.812000 audit[5442]: USER_START pid=5442 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:45.814000 audit[5446]: CRED_ACQ pid=5446 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:45.820663 kernel: audit: type=1105 audit(1768586565.812:838): pid=5442 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:45.820738 kernel: audit: type=1103 audit(1768586565.814:839): pid=5446 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:46.152783 sshd[5446]: Connection closed by 4.153.228.146 port 59996 Jan 16 18:02:46.152231 sshd-session[5442]: pam_unix(sshd:session): session closed for user core Jan 16 18:02:46.152000 audit[5442]: USER_END pid=5442 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:46.156553 systemd[1]: session-20.scope: Deactivated 
successfully. Jan 16 18:02:46.157345 systemd[1]: sshd@18-10.0.7.62:22-4.153.228.146:59996.service: Deactivated successfully. Jan 16 18:02:46.152000 audit[5442]: CRED_DISP pid=5442 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:46.160983 kernel: audit: type=1106 audit(1768586566.152:840): pid=5442 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:46.161083 kernel: audit: type=1104 audit(1768586566.152:841): pid=5442 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:46.156000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.7.62:22-4.153.228.146:59996 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:46.161491 systemd-logind[1644]: Session 20 logged out. Waiting for processes to exit. Jan 16 18:02:46.164361 kernel: audit: type=1131 audit(1768586566.156:842): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.7.62:22-4.153.228.146:59996 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:46.165739 systemd-logind[1644]: Removed session 20. 
Jan 16 18:02:46.602227 kubelet[2921]: E0116 18:02:46.602094 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:02:48.167000 audit[5459]: NETFILTER_CFG table=filter:146 family=2 entries=26 op=nft_register_rule pid=5459 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:02:48.167000 audit[5459]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd80342b0 a2=0 a3=1 items=0 ppid=3072 pid=5459 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:48.167000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:02:48.175000 audit[5459]: NETFILTER_CFG table=nat:147 family=2 entries=104 op=nft_register_chain pid=5459 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:02:48.175000 audit[5459]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=ffffd80342b0 a2=0 a3=1 items=0 ppid=3072 pid=5459 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:48.175000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:02:48.602665 kubelet[2921]: E0116 
18:02:48.602446 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:02:49.605487 kubelet[2921]: E0116 18:02:49.604106 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:02:50.603259 kubelet[2921]: E0116 18:02:50.603132 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:02:50.604275 kubelet[2921]: E0116 18:02:50.604211 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: 
rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:02:51.258000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.7.62:22-4.153.228.146:60012 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:51.260219 systemd[1]: Started sshd@19-10.0.7.62:22-4.153.228.146:60012.service - OpenSSH per-connection server daemon (4.153.228.146:60012). Jan 16 18:02:51.263746 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 16 18:02:51.263815 kernel: audit: type=1130 audit(1768586571.258:845): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.7.62:22-4.153.228.146:60012 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 18:02:51.606843 kubelet[2921]: E0116 18:02:51.606632 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:02:51.790000 audit[5461]: USER_ACCT pid=5461 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:51.792281 sshd[5461]: Accepted publickey for core from 4.153.228.146 port 60012 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 18:02:51.794000 audit[5461]: CRED_ACQ pid=5461 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:51.796596 sshd-session[5461]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:02:51.799431 kernel: audit: type=1101 audit(1768586571.790:846): pid=5461 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:51.799533 kernel: audit: type=1103 audit(1768586571.794:847): pid=5461 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:51.799552 kernel: audit: type=1006 audit(1768586571.794:848): pid=5461 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Jan 16 18:02:51.801364 kernel: audit: type=1300 audit(1768586571.794:848): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff1acf560 a2=3 a3=0 items=0 ppid=1 pid=5461 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:51.794000 audit[5461]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff1acf560 a2=3 a3=0 items=0 ppid=1 pid=5461 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:51.794000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:02:51.806173 kernel: audit: type=1327 audit(1768586571.794:848): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:02:51.807895 systemd-logind[1644]: New session 21 of user core. Jan 16 18:02:51.812657 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 16 18:02:51.814000 audit[5461]: USER_START pid=5461 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:51.818000 audit[5465]: CRED_ACQ pid=5465 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:51.823305 kernel: audit: type=1105 audit(1768586571.814:849): pid=5461 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:51.823370 kernel: audit: type=1103 audit(1768586571.818:850): pid=5465 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:52.162446 sshd[5465]: Connection closed by 4.153.228.146 port 60012 Jan 16 18:02:52.162596 sshd-session[5461]: pam_unix(sshd:session): session closed for user core Jan 16 18:02:52.162000 audit[5461]: USER_END pid=5461 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:52.168174 systemd[1]: 
sshd@19-10.0.7.62:22-4.153.228.146:60012.service: Deactivated successfully. Jan 16 18:02:52.162000 audit[5461]: CRED_DISP pid=5461 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:52.171393 systemd[1]: session-21.scope: Deactivated successfully. Jan 16 18:02:52.174094 kernel: audit: type=1106 audit(1768586572.162:851): pid=5461 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:52.174193 kernel: audit: type=1104 audit(1768586572.162:852): pid=5461 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:52.174211 systemd-logind[1644]: Session 21 logged out. Waiting for processes to exit. Jan 16 18:02:52.177496 systemd-logind[1644]: Removed session 21. Jan 16 18:02:52.167000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.7.62:22-4.153.228.146:60012 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:57.273000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.7.62:22-4.153.228.146:50800 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 18:02:57.276066 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 18:02:57.276213 kernel: audit: type=1130 audit(1768586577.273:854): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.7.62:22-4.153.228.146:50800 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:57.274925 systemd[1]: Started sshd@20-10.0.7.62:22-4.153.228.146:50800.service - OpenSSH per-connection server daemon (4.153.228.146:50800). Jan 16 18:02:57.607987 kubelet[2921]: E0116 18:02:57.607551 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:02:57.818000 audit[5480]: USER_ACCT pid=5480 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:57.820347 sshd[5480]: Accepted publickey for core from 4.153.228.146 port 50800 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 18:02:57.824440 kernel: audit: type=1101 audit(1768586577.818:855): pid=5480 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:57.824495 kernel: audit: type=1103 
audit(1768586577.822:856): pid=5480 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:57.822000 audit[5480]: CRED_ACQ pid=5480 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:57.825165 sshd-session[5480]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:02:57.829705 kernel: audit: type=1006 audit(1768586577.823:857): pid=5480 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Jan 16 18:02:57.823000 audit[5480]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffed4b0e80 a2=3 a3=0 items=0 ppid=1 pid=5480 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:57.831957 systemd-logind[1644]: New session 22 of user core. Jan 16 18:02:57.833643 kernel: audit: type=1300 audit(1768586577.823:857): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffed4b0e80 a2=3 a3=0 items=0 ppid=1 pid=5480 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:02:57.823000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:02:57.834968 kernel: audit: type=1327 audit(1768586577.823:857): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:02:57.845709 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jan 16 18:02:57.846000 audit[5480]: USER_START pid=5480 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:57.848000 audit[5484]: CRED_ACQ pid=5484 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:57.855128 kernel: audit: type=1105 audit(1768586577.846:858): pid=5480 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:57.855205 kernel: audit: type=1103 audit(1768586577.848:859): pid=5484 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:58.179187 sshd[5484]: Connection closed by 4.153.228.146 port 50800 Jan 16 18:02:58.179541 sshd-session[5480]: pam_unix(sshd:session): session closed for user core Jan 16 18:02:58.179000 audit[5480]: USER_END pid=5480 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:58.184926 systemd[1]: 
sshd@20-10.0.7.62:22-4.153.228.146:50800.service: Deactivated successfully. Jan 16 18:02:58.179000 audit[5480]: CRED_DISP pid=5480 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:58.186874 systemd[1]: session-22.scope: Deactivated successfully. Jan 16 18:02:58.187985 kernel: audit: type=1106 audit(1768586578.179:860): pid=5480 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:58.188160 kernel: audit: type=1104 audit(1768586578.179:861): pid=5480 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:02:58.183000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.7.62:22-4.153.228.146:50800 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:02:58.188224 systemd-logind[1644]: Session 22 logged out. Waiting for processes to exit. Jan 16 18:02:58.189857 systemd-logind[1644]: Removed session 22. 
Jan 16 18:02:59.602619 kubelet[2921]: E0116 18:02:59.602546 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:03:03.287000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.7.62:22-4.153.228.146:50814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:03:03.289298 systemd[1]: Started sshd@21-10.0.7.62:22-4.153.228.146:50814.service - OpenSSH per-connection server daemon (4.153.228.146:50814). Jan 16 18:03:03.290458 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 18:03:03.290523 kernel: audit: type=1130 audit(1768586583.287:863): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.7.62:22-4.153.228.146:50814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 18:03:03.606527 kubelet[2921]: E0116 18:03:03.605239 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:03:03.606527 kubelet[2921]: E0116 18:03:03.605442 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:03:03.606527 kubelet[2921]: E0116 18:03:03.606183 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed 
to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:03:03.809000 audit[5500]: USER_ACCT pid=5500 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:03.812585 sshd[5500]: Accepted publickey for core from 4.153.228.146 port 50814 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 18:03:03.814191 sshd-session[5500]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:03:03.811000 audit[5500]: CRED_ACQ pid=5500 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:03.817643 kernel: audit: type=1101 audit(1768586583.809:864): pid=5500 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:03.817720 kernel: audit: type=1103 audit(1768586583.811:865): pid=5500 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:03.819592 kernel: audit: type=1006 audit(1768586583.811:866): pid=5500 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Jan 16 18:03:03.811000 audit[5500]: SYSCALL arch=c00000b7 syscall=64 success=yes 
exit=3 a0=8 a1=fffff893cf10 a2=3 a3=0 items=0 ppid=1 pid=5500 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:03.823318 kernel: audit: type=1300 audit(1768586583.811:866): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff893cf10 a2=3 a3=0 items=0 ppid=1 pid=5500 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:03.811000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:03:03.824645 kernel: audit: type=1327 audit(1768586583.811:866): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:03:03.827896 systemd-logind[1644]: New session 23 of user core. Jan 16 18:03:03.837616 systemd[1]: Started session-23.scope - Session 23 of User core. 
Jan 16 18:03:03.839000 audit[5500]: USER_START pid=5500 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:03.844000 audit[5504]: CRED_ACQ pid=5504 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:03.848418 kernel: audit: type=1105 audit(1768586583.839:867): pid=5500 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:03.848576 kernel: audit: type=1103 audit(1768586583.844:868): pid=5504 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:04.167841 sshd[5504]: Connection closed by 4.153.228.146 port 50814 Jan 16 18:03:04.168384 sshd-session[5500]: pam_unix(sshd:session): session closed for user core Jan 16 18:03:04.168000 audit[5500]: USER_END pid=5500 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:04.172464 systemd[1]: 
sshd@21-10.0.7.62:22-4.153.228.146:50814.service: Deactivated successfully. Jan 16 18:03:04.168000 audit[5500]: CRED_DISP pid=5500 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:04.174990 systemd[1]: session-23.scope: Deactivated successfully. Jan 16 18:03:04.176162 kernel: audit: type=1106 audit(1768586584.168:869): pid=5500 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:04.176217 kernel: audit: type=1104 audit(1768586584.168:870): pid=5500 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:04.172000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.7.62:22-4.153.228.146:50814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:03:04.176866 systemd-logind[1644]: Session 23 logged out. Waiting for processes to exit. Jan 16 18:03:04.177626 systemd-logind[1644]: Removed session 23. 
Jan 16 18:03:06.602899 kubelet[2921]: E0116 18:03:06.602841 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:03:09.277000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.7.62:22-4.153.228.146:43996 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:03:09.279661 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 18:03:09.279706 kernel: audit: type=1130 audit(1768586589.277:872): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.7.62:22-4.153.228.146:43996 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:03:09.278756 systemd[1]: Started sshd@22-10.0.7.62:22-4.153.228.146:43996.service - OpenSSH per-connection server daemon (4.153.228.146:43996). 
Jan 16 18:03:09.817863 sshd[5518]: Accepted publickey for core from 4.153.228.146 port 43996 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 18:03:09.816000 audit[5518]: USER_ACCT pid=5518 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:09.819753 sshd-session[5518]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:03:09.817000 audit[5518]: CRED_ACQ pid=5518 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:09.824241 kernel: audit: type=1101 audit(1768586589.816:873): pid=5518 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:09.824306 kernel: audit: type=1103 audit(1768586589.817:874): pid=5518 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:09.824336 kernel: audit: type=1006 audit(1768586589.817:875): pid=5518 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Jan 16 18:03:09.817000 audit[5518]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd97edfc0 a2=3 a3=0 items=0 ppid=1 pid=5518 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" 
exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:09.828271 systemd-logind[1644]: New session 24 of user core. Jan 16 18:03:09.829527 kernel: audit: type=1300 audit(1768586589.817:875): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd97edfc0 a2=3 a3=0 items=0 ppid=1 pid=5518 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:09.829619 kernel: audit: type=1327 audit(1768586589.817:875): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:03:09.817000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:03:09.836608 systemd[1]: Started session-24.scope - Session 24 of User core. Jan 16 18:03:09.840000 audit[5518]: USER_START pid=5518 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:09.841000 audit[5522]: CRED_ACQ pid=5522 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:09.851121 kernel: audit: type=1105 audit(1768586589.840:876): pid=5518 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:09.851223 kernel: audit: type=1103 audit(1768586589.841:877): pid=5522 uid=0 auid=500 
ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:10.175024 sshd[5522]: Connection closed by 4.153.228.146 port 43996 Jan 16 18:03:10.176072 sshd-session[5518]: pam_unix(sshd:session): session closed for user core Jan 16 18:03:10.177000 audit[5518]: USER_END pid=5518 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:10.182519 systemd-logind[1644]: Session 24 logged out. Waiting for processes to exit. Jan 16 18:03:10.182628 systemd[1]: sshd@22-10.0.7.62:22-4.153.228.146:43996.service: Deactivated successfully. Jan 16 18:03:10.177000 audit[5518]: CRED_DISP pid=5518 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:10.185900 kernel: audit: type=1106 audit(1768586590.177:878): pid=5518 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:10.185968 kernel: audit: type=1104 audit(1768586590.177:879): pid=5518 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:10.181000 audit[1]: 
SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.7.62:22-4.153.228.146:43996 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:03:10.186123 systemd[1]: session-24.scope: Deactivated successfully. Jan 16 18:03:10.188810 systemd-logind[1644]: Removed session 24. Jan 16 18:03:10.602640 kubelet[2921]: E0116 18:03:10.602539 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:03:10.604250 kubelet[2921]: E0116 18:03:10.604212 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:03:15.283824 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 18:03:15.283926 kernel: audit: type=1130 audit(1768586595.281:881): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.7.62:22-4.153.228.146:33896 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 18:03:15.281000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.7.62:22-4.153.228.146:33896 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:03:15.282862 systemd[1]: Started sshd@23-10.0.7.62:22-4.153.228.146:33896.service - OpenSSH per-connection server daemon (4.153.228.146:33896). Jan 16 18:03:15.812000 audit[5559]: USER_ACCT pid=5559 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:15.814587 sshd[5559]: Accepted publickey for core from 4.153.228.146 port 33896 ssh2: RSA SHA256:oeD2Uxu/dx5g2/RqBa/y8xsSs9TWdr1HcWxT68/O3TM Jan 16 18:03:15.818464 kernel: audit: type=1101 audit(1768586595.812:882): pid=5559 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:15.817000 audit[5559]: CRED_ACQ pid=5559 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:15.820206 sshd-session[5559]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:03:15.823769 kernel: audit: type=1103 audit(1768586595.817:883): pid=5559 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 
18:03:15.823834 kernel: audit: type=1006 audit(1768586595.817:884): pid=5559 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Jan 16 18:03:15.824086 kernel: audit: type=1300 audit(1768586595.817:884): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffed7ef190 a2=3 a3=0 items=0 ppid=1 pid=5559 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:15.817000 audit[5559]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffed7ef190 a2=3 a3=0 items=0 ppid=1 pid=5559 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:15.817000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:03:15.828793 kernel: audit: type=1327 audit(1768586595.817:884): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:03:15.831130 systemd-logind[1644]: New session 25 of user core. Jan 16 18:03:15.842618 systemd[1]: Started session-25.scope - Session 25 of User core. 
Jan 16 18:03:15.843000 audit[5559]: USER_START pid=5559 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:15.845000 audit[5563]: CRED_ACQ pid=5563 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:15.851925 kernel: audit: type=1105 audit(1768586595.843:885): pid=5559 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:15.851989 kernel: audit: type=1103 audit(1768586595.845:886): pid=5563 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:16.180125 sshd[5563]: Connection closed by 4.153.228.146 port 33896 Jan 16 18:03:16.180430 sshd-session[5559]: pam_unix(sshd:session): session closed for user core Jan 16 18:03:16.181000 audit[5559]: USER_END pid=5559 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:16.185868 systemd-logind[1644]: Session 25 logged out. 
Waiting for processes to exit. Jan 16 18:03:16.186117 systemd[1]: sshd@23-10.0.7.62:22-4.153.228.146:33896.service: Deactivated successfully. Jan 16 18:03:16.181000 audit[5559]: CRED_DISP pid=5559 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:16.187813 systemd[1]: session-25.scope: Deactivated successfully. Jan 16 18:03:16.189893 kernel: audit: type=1106 audit(1768586596.181:887): pid=5559 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:16.189951 kernel: audit: type=1104 audit(1768586596.181:888): pid=5559 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 16 18:03:16.185000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.7.62:22-4.153.228.146:33896 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:03:16.190431 systemd-logind[1644]: Removed session 25. 
Jan 16 18:03:16.603312 kubelet[2921]: E0116 18:03:16.602942 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:03:16.604140 kubelet[2921]: E0116 18:03:16.604065 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:03:17.602997 kubelet[2921]: E0116 18:03:17.602755 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve 
image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:03:21.602802 kubelet[2921]: E0116 18:03:21.602739 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:03:22.602826 kubelet[2921]: E0116 18:03:22.602465 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:03:23.604055 kubelet[2921]: E0116 18:03:23.604018 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound 
desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:03:28.602809 kubelet[2921]: E0116 18:03:28.602750 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:03:29.603997 kubelet[2921]: E0116 18:03:29.603793 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:03:31.605670 kubelet[2921]: E0116 18:03:31.605345 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:03:34.444480 update_engine[1645]: I20260116 18:03:34.443522 1645 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 16 18:03:34.444480 update_engine[1645]: I20260116 18:03:34.443576 1645 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 16 18:03:34.444480 update_engine[1645]: I20260116 18:03:34.443807 1645 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 16 18:03:34.444480 update_engine[1645]: I20260116 18:03:34.444140 1645 omaha_request_params.cc:62] Current group set to developer Jan 16 18:03:34.444480 update_engine[1645]: I20260116 18:03:34.444232 1645 update_attempter.cc:499] Already updated boot flags. Skipping. Jan 16 18:03:34.444480 update_engine[1645]: I20260116 18:03:34.444242 1645 update_attempter.cc:643] Scheduling an action processor start. 
Jan 16 18:03:34.444480 update_engine[1645]: I20260116 18:03:34.444267 1645 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 16 18:03:34.445340 update_engine[1645]: I20260116 18:03:34.444515 1645 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 16 18:03:34.445340 update_engine[1645]: I20260116 18:03:34.444564 1645 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 16 18:03:34.445340 update_engine[1645]: I20260116 18:03:34.444571 1645 omaha_request_action.cc:272] Request: Jan 16 18:03:34.445340 update_engine[1645]: Jan 16 18:03:34.445340 update_engine[1645]: Jan 16 18:03:34.445340 update_engine[1645]: Jan 16 18:03:34.445340 update_engine[1645]: Jan 16 18:03:34.445340 update_engine[1645]: Jan 16 18:03:34.445340 update_engine[1645]: Jan 16 18:03:34.445340 update_engine[1645]: Jan 16 18:03:34.445340 update_engine[1645]: Jan 16 18:03:34.445340 update_engine[1645]: I20260116 18:03:34.444602 1645 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 16 18:03:34.446032 locksmithd[1701]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 16 18:03:34.446525 update_engine[1645]: I20260116 18:03:34.446483 1645 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 16 18:03:34.447300 update_engine[1645]: I20260116 18:03:34.447259 1645 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 16 18:03:34.458561 update_engine[1645]: E20260116 18:03:34.458508 1645 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 16 18:03:34.458710 update_engine[1645]: I20260116 18:03:34.458627 1645 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 16 18:03:34.602835 containerd[1658]: time="2026-01-16T18:03:34.602758608Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 16 18:03:34.947351 containerd[1658]: time="2026-01-16T18:03:34.947047091Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:03:34.950319 containerd[1658]: time="2026-01-16T18:03:34.950282701Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 16 18:03:34.950542 containerd[1658]: time="2026-01-16T18:03:34.950367781Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 16 18:03:34.950717 kubelet[2921]: E0116 18:03:34.950657 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 18:03:34.950717 kubelet[2921]: E0116 18:03:34.950710 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 18:03:34.951081 kubelet[2921]: E0116 18:03:34.950809 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:0b1f42d8785e424d85c2c2149fa8b5bd,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6vqqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-55c9fc4598-2nzbk_calico-system(19ae2064-76f2-47ed-9968-9de33cfc7702): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 16 18:03:34.952776 containerd[1658]: time="2026-01-16T18:03:34.952748429Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 16 18:03:35.302594 containerd[1658]: 
time="2026-01-16T18:03:35.302369368Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:03:35.308871 containerd[1658]: time="2026-01-16T18:03:35.308749467Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 16 18:03:35.308871 containerd[1658]: time="2026-01-16T18:03:35.308823548Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 16 18:03:35.309018 kubelet[2921]: E0116 18:03:35.308977 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 18:03:35.309057 kubelet[2921]: E0116 18:03:35.309033 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 18:03:35.309192 kubelet[2921]: E0116 18:03:35.309146 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6vqqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-55c9fc4598-2nzbk_calico-system(19ae2064-76f2-47ed-9968-9de33cfc7702): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 16 18:03:35.310381 kubelet[2921]: E0116 18:03:35.310321 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:03:35.603355 kubelet[2921]: E0116 18:03:35.603241 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:03:36.602917 containerd[1658]: time="2026-01-16T18:03:36.602846708Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 16 18:03:36.938939 containerd[1658]: time="2026-01-16T18:03:36.938892086Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:03:36.942789 containerd[1658]: time="2026-01-16T18:03:36.942713138Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 16 18:03:36.942922 containerd[1658]: time="2026-01-16T18:03:36.942796058Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 16 18:03:36.943027 kubelet[2921]: E0116 18:03:36.942946 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 18:03:36.943027 kubelet[2921]: E0116 18:03:36.942990 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 18:03:36.943365 kubelet[2921]: E0116 18:03:36.943107 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbbxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-j5dxx_calico-system(64da405a-12aa-44ec-8b4d-a44866f591ec): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 16 18:03:36.944298 kubelet[2921]: E0116 18:03:36.944260 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:03:42.601742 kubelet[2921]: E0116 18:03:42.601701 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:03:43.602991 containerd[1658]: time="2026-01-16T18:03:43.602904384Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 18:03:43.936081 containerd[1658]: time="2026-01-16T18:03:43.935987509Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:03:43.937790 containerd[1658]: time="2026-01-16T18:03:43.937747274Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 18:03:43.937910 containerd[1658]: time="2026-01-16T18:03:43.937824034Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 18:03:43.937998 kubelet[2921]: E0116 18:03:43.937953 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:03:43.938330 kubelet[2921]: E0116 18:03:43.938007 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:03:43.938330 kubelet[2921]: E0116 18:03:43.938136 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdp42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5cdd49cc55-ck2gc_calico-apiserver(2d37882f-c058-4d23-87c0-40e1fbaf0de7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 18:03:43.939328 kubelet[2921]: E0116 18:03:43.939279 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:03:44.434537 update_engine[1645]: I20260116 18:03:44.434098 1645 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 16 18:03:44.434537 update_engine[1645]: I20260116 18:03:44.434206 1645 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 16 18:03:44.435134 update_engine[1645]: 
I20260116 18:03:44.434707 1645 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 16 18:03:44.444245 update_engine[1645]: E20260116 18:03:44.444198 1645 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 16 18:03:44.444298 update_engine[1645]: I20260116 18:03:44.444273 1645 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jan 16 18:03:46.602031 kubelet[2921]: E0116 18:03:46.601969 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3" Jan 16 18:03:47.059612 kubelet[2921]: E0116 18:03:47.059492 2921 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.7.62:60460->10.0.7.107:2379: read: connection timed out" event="&Event{ObjectMeta:{goldmane-666569f655-j5dxx.188b4801451652fc calico-system 1313 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-system,Name:goldmane-666569f655-j5dxx,UID:64da405a-12aa-44ec-8b4d-a44866f591ec,APIVersion:v1,ResourceVersion:813,FieldPath:spec.containers{goldmane},},Reason:Pulling,Message:Pulling image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4580-0-0-p-7f6b5ebc40,},FirstTimestamp:2026-01-16 18:00:48 +0000 UTC,LastTimestamp:2026-01-16 18:03:36.602576027 +0000 UTC m=+223.090363314,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4580-0-0-p-7f6b5ebc40,}" Jan 16 18:03:47.603027 kubelet[2921]: E0116 18:03:47.602957 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-55c9fc4598-2nzbk" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" Jan 16 18:03:47.643217 systemd[1]: cri-containerd-9faa95818482f58af12cff6b2566c18653b9fe7925f9e1535b63a59d8fa9c546.scope: Deactivated successfully. Jan 16 18:03:47.643575 systemd[1]: cri-containerd-9faa95818482f58af12cff6b2566c18653b9fe7925f9e1535b63a59d8fa9c546.scope: Consumed 5.041s CPU time, 58M memory peak. 
Jan 16 18:03:47.645804 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 18:03:47.645860 kernel: audit: type=1334 audit(1768586627.642:890): prog-id=256 op=LOAD Jan 16 18:03:47.642000 audit: BPF prog-id=256 op=LOAD Jan 16 18:03:47.642000 audit: BPF prog-id=88 op=UNLOAD Jan 16 18:03:47.647096 kernel: audit: type=1334 audit(1768586627.642:891): prog-id=88 op=UNLOAD Jan 16 18:03:47.647509 containerd[1658]: time="2026-01-16T18:03:47.647469863Z" level=info msg="received container exit event container_id:\"9faa95818482f58af12cff6b2566c18653b9fe7925f9e1535b63a59d8fa9c546\" id:\"9faa95818482f58af12cff6b2566c18653b9fe7925f9e1535b63a59d8fa9c546\" pid:2775 exit_status:1 exited_at:{seconds:1768586627 nanos:646568060}" Jan 16 18:03:47.647000 audit: BPF prog-id=103 op=UNLOAD Jan 16 18:03:47.647000 audit: BPF prog-id=107 op=UNLOAD Jan 16 18:03:47.650557 kernel: audit: type=1334 audit(1768586627.647:892): prog-id=103 op=UNLOAD Jan 16 18:03:47.650609 kernel: audit: type=1334 audit(1768586627.647:893): prog-id=107 op=UNLOAD Jan 16 18:03:47.667281 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9faa95818482f58af12cff6b2566c18653b9fe7925f9e1535b63a59d8fa9c546-rootfs.mount: Deactivated successfully. Jan 16 18:03:47.893194 kubelet[2921]: E0116 18:03:47.893069 2921 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.7.62:60628->10.0.7.107:2379: read: connection timed out" Jan 16 18:03:47.896716 systemd[1]: cri-containerd-9bf6682c38eeafe1f91e0d8d24ff6f42acc1ff588bf5c6ddda9ebaabc6b16041.scope: Deactivated successfully. Jan 16 18:03:47.897031 systemd[1]: cri-containerd-9bf6682c38eeafe1f91e0d8d24ff6f42acc1ff588bf5c6ddda9ebaabc6b16041.scope: Consumed 4.918s CPU time, 23.1M memory peak. 
Jan 16 18:03:47.896000 audit: BPF prog-id=257 op=LOAD Jan 16 18:03:47.899571 containerd[1658]: time="2026-01-16T18:03:47.899533467Z" level=info msg="received container exit event container_id:\"9bf6682c38eeafe1f91e0d8d24ff6f42acc1ff588bf5c6ddda9ebaabc6b16041\" id:\"9bf6682c38eeafe1f91e0d8d24ff6f42acc1ff588bf5c6ddda9ebaabc6b16041\" pid:2782 exit_status:1 exited_at:{seconds:1768586627 nanos:898587744}" Jan 16 18:03:47.896000 audit: BPF prog-id=93 op=UNLOAD Jan 16 18:03:47.900825 kernel: audit: type=1334 audit(1768586627.896:894): prog-id=257 op=LOAD Jan 16 18:03:47.900934 kernel: audit: type=1334 audit(1768586627.896:895): prog-id=93 op=UNLOAD Jan 16 18:03:47.900967 kernel: audit: type=1334 audit(1768586627.899:896): prog-id=108 op=UNLOAD Jan 16 18:03:47.899000 audit: BPF prog-id=108 op=UNLOAD Jan 16 18:03:47.902612 kernel: audit: type=1334 audit(1768586627.899:897): prog-id=112 op=UNLOAD Jan 16 18:03:47.899000 audit: BPF prog-id=112 op=UNLOAD Jan 16 18:03:47.919179 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9bf6682c38eeafe1f91e0d8d24ff6f42acc1ff588bf5c6ddda9ebaabc6b16041-rootfs.mount: Deactivated successfully. Jan 16 18:03:48.057907 systemd[1]: cri-containerd-f7613897bdfe27a3309792e3ca6c3cdfdac7c69f80eca7431c266102fe31f56f.scope: Deactivated successfully. Jan 16 18:03:48.058236 systemd[1]: cri-containerd-f7613897bdfe27a3309792e3ca6c3cdfdac7c69f80eca7431c266102fe31f56f.scope: Consumed 39.191s CPU time, 117.5M memory peak. 
Jan 16 18:03:48.060830 containerd[1658]: time="2026-01-16T18:03:48.060796555Z" level=info msg="received container exit event container_id:\"f7613897bdfe27a3309792e3ca6c3cdfdac7c69f80eca7431c266102fe31f56f\" id:\"f7613897bdfe27a3309792e3ca6c3cdfdac7c69f80eca7431c266102fe31f56f\" pid:3242 exit_status:1 exited_at:{seconds:1768586628 nanos:60545354}" Jan 16 18:03:48.060000 audit: BPF prog-id=146 op=UNLOAD Jan 16 18:03:48.060000 audit: BPF prog-id=150 op=UNLOAD Jan 16 18:03:48.063550 kernel: audit: type=1334 audit(1768586628.060:898): prog-id=146 op=UNLOAD Jan 16 18:03:48.063618 kernel: audit: type=1334 audit(1768586628.060:899): prog-id=150 op=UNLOAD Jan 16 18:03:48.082020 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f7613897bdfe27a3309792e3ca6c3cdfdac7c69f80eca7431c266102fe31f56f-rootfs.mount: Deactivated successfully. Jan 16 18:03:48.158910 kubelet[2921]: I0116 18:03:48.158835 2921 scope.go:117] "RemoveContainer" containerID="9bf6682c38eeafe1f91e0d8d24ff6f42acc1ff588bf5c6ddda9ebaabc6b16041" Jan 16 18:03:48.160723 containerd[1658]: time="2026-01-16T18:03:48.160680378Z" level=info msg="CreateContainer within sandbox \"8bf801f1f330784ad07cd6cec66806776bcf15f176e992b41635d516a430ebf0\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 16 18:03:48.161744 kubelet[2921]: I0116 18:03:48.161723 2921 scope.go:117] "RemoveContainer" containerID="9faa95818482f58af12cff6b2566c18653b9fe7925f9e1535b63a59d8fa9c546" Jan 16 18:03:48.162106 kubelet[2921]: I0116 18:03:48.162090 2921 scope.go:117] "RemoveContainer" containerID="f7613897bdfe27a3309792e3ca6c3cdfdac7c69f80eca7431c266102fe31f56f" Jan 16 18:03:48.163298 containerd[1658]: time="2026-01-16T18:03:48.163259066Z" level=info msg="CreateContainer within sandbox \"c753fbe6018164acdeec60a62a596185ee77b85ddc6e82e0db403e8c654a9197\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 16 18:03:48.163911 containerd[1658]: time="2026-01-16T18:03:48.163877227Z" level=info 
msg="CreateContainer within sandbox \"c251e3cff3b9fc6edd0485af3e7894b38b01802328d56ea2304862e369604a2d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 16 18:03:48.184457 containerd[1658]: time="2026-01-16T18:03:48.184399490Z" level=info msg="Container c8ba2a8a84f78f9894479b1af1ff818b6560010f18618555ae19c84b947e6334: CDI devices from CRI Config.CDIDevices: []" Jan 16 18:03:48.191811 containerd[1658]: time="2026-01-16T18:03:48.191710472Z" level=info msg="Container b9f8553462735a948e96477687dc9febb6aaf68a05e7e36301b27ddf463b15a9: CDI devices from CRI Config.CDIDevices: []" Jan 16 18:03:48.201465 containerd[1658]: time="2026-01-16T18:03:48.201388341Z" level=info msg="Container 5af133a973258898fbb269f2fb793401339fbb27facf0399a6d7fb6d0c7f88f9: CDI devices from CRI Config.CDIDevices: []" Jan 16 18:03:48.210799 containerd[1658]: time="2026-01-16T18:03:48.210758649Z" level=info msg="CreateContainer within sandbox \"8bf801f1f330784ad07cd6cec66806776bcf15f176e992b41635d516a430ebf0\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"c8ba2a8a84f78f9894479b1af1ff818b6560010f18618555ae19c84b947e6334\"" Jan 16 18:03:48.211300 containerd[1658]: time="2026-01-16T18:03:48.211260291Z" level=info msg="StartContainer for \"c8ba2a8a84f78f9894479b1af1ff818b6560010f18618555ae19c84b947e6334\"" Jan 16 18:03:48.212344 containerd[1658]: time="2026-01-16T18:03:48.212311214Z" level=info msg="connecting to shim c8ba2a8a84f78f9894479b1af1ff818b6560010f18618555ae19c84b947e6334" address="unix:///run/containerd/s/586f78604bf3c30db8b94bb7d3c443759db9b48c7b1fd97f938c0ab4a105d37b" protocol=ttrpc version=3 Jan 16 18:03:48.222914 containerd[1658]: time="2026-01-16T18:03:48.222874326Z" level=info msg="CreateContainer within sandbox \"c251e3cff3b9fc6edd0485af3e7894b38b01802328d56ea2304862e369604a2d\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"5af133a973258898fbb269f2fb793401339fbb27facf0399a6d7fb6d0c7f88f9\"" Jan 16 
18:03:48.224203 containerd[1658]: time="2026-01-16T18:03:48.224148850Z" level=info msg="StartContainer for \"5af133a973258898fbb269f2fb793401339fbb27facf0399a6d7fb6d0c7f88f9\"" Jan 16 18:03:48.225646 containerd[1658]: time="2026-01-16T18:03:48.225622335Z" level=info msg="connecting to shim 5af133a973258898fbb269f2fb793401339fbb27facf0399a6d7fb6d0c7f88f9" address="unix:///run/containerd/s/3ad9b1bba85fc187dd2de2bc469d137db727bb5570ed6e4303c1ec5259ca0681" protocol=ttrpc version=3 Jan 16 18:03:48.225883 containerd[1658]: time="2026-01-16T18:03:48.225833935Z" level=info msg="CreateContainer within sandbox \"c753fbe6018164acdeec60a62a596185ee77b85ddc6e82e0db403e8c654a9197\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"b9f8553462735a948e96477687dc9febb6aaf68a05e7e36301b27ddf463b15a9\"" Jan 16 18:03:48.226138 containerd[1658]: time="2026-01-16T18:03:48.226109736Z" level=info msg="StartContainer for \"b9f8553462735a948e96477687dc9febb6aaf68a05e7e36301b27ddf463b15a9\"" Jan 16 18:03:48.227202 containerd[1658]: time="2026-01-16T18:03:48.227177699Z" level=info msg="connecting to shim b9f8553462735a948e96477687dc9febb6aaf68a05e7e36301b27ddf463b15a9" address="unix:///run/containerd/s/dddb982b1fd99a2f0c8b7c7189895e05c0938d13a57a539f7eeac68957f5fd49" protocol=ttrpc version=3 Jan 16 18:03:48.230630 systemd[1]: Started cri-containerd-c8ba2a8a84f78f9894479b1af1ff818b6560010f18618555ae19c84b947e6334.scope - libcontainer container c8ba2a8a84f78f9894479b1af1ff818b6560010f18618555ae19c84b947e6334. Jan 16 18:03:48.250893 systemd[1]: Started cri-containerd-5af133a973258898fbb269f2fb793401339fbb27facf0399a6d7fb6d0c7f88f9.scope - libcontainer container 5af133a973258898fbb269f2fb793401339fbb27facf0399a6d7fb6d0c7f88f9. 
Jan 16 18:03:48.253000 audit: BPF prog-id=258 op=LOAD Jan 16 18:03:48.254000 audit: BPF prog-id=259 op=LOAD Jan 16 18:03:48.254000 audit[5655]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2660 pid=5655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.254000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338626132613861383466373866393839343437396231616631666638 Jan 16 18:03:48.254000 audit: BPF prog-id=259 op=UNLOAD Jan 16 18:03:48.254000 audit[5655]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2660 pid=5655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.254000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338626132613861383466373866393839343437396231616631666638 Jan 16 18:03:48.254000 audit: BPF prog-id=260 op=LOAD Jan 16 18:03:48.254000 audit[5655]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2660 pid=5655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.256483 systemd[1]: Started cri-containerd-b9f8553462735a948e96477687dc9febb6aaf68a05e7e36301b27ddf463b15a9.scope - libcontainer container 
b9f8553462735a948e96477687dc9febb6aaf68a05e7e36301b27ddf463b15a9. Jan 16 18:03:48.254000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338626132613861383466373866393839343437396231616631666638 Jan 16 18:03:48.255000 audit: BPF prog-id=261 op=LOAD Jan 16 18:03:48.255000 audit[5655]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2660 pid=5655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.255000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338626132613861383466373866393839343437396231616631666638 Jan 16 18:03:48.255000 audit: BPF prog-id=261 op=UNLOAD Jan 16 18:03:48.255000 audit[5655]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2660 pid=5655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.255000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338626132613861383466373866393839343437396231616631666638 Jan 16 18:03:48.256000 audit: BPF prog-id=260 op=UNLOAD Jan 16 18:03:48.256000 audit[5655]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2660 pid=5655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338626132613861383466373866393839343437396231616631666638 Jan 16 18:03:48.256000 audit: BPF prog-id=262 op=LOAD Jan 16 18:03:48.256000 audit[5655]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2660 pid=5655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6338626132613861383466373866393839343437396231616631666638 Jan 16 18:03:48.267000 audit: BPF prog-id=263 op=LOAD Jan 16 18:03:48.267000 audit: BPF prog-id=264 op=LOAD Jan 16 18:03:48.267000 audit[5667]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3023 pid=5667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.267000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561663133336139373332353838393866626232363966326662373933 Jan 16 18:03:48.267000 audit: BPF prog-id=264 op=UNLOAD Jan 16 18:03:48.267000 audit[5667]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 
items=0 ppid=3023 pid=5667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.267000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561663133336139373332353838393866626232363966326662373933 Jan 16 18:03:48.268000 audit: BPF prog-id=265 op=LOAD Jan 16 18:03:48.268000 audit[5667]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3023 pid=5667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.268000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561663133336139373332353838393866626232363966326662373933 Jan 16 18:03:48.268000 audit: BPF prog-id=266 op=LOAD Jan 16 18:03:48.268000 audit[5667]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3023 pid=5667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.268000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561663133336139373332353838393866626232363966326662373933 Jan 16 18:03:48.268000 audit: BPF prog-id=266 op=UNLOAD Jan 16 18:03:48.268000 audit[5667]: SYSCALL arch=c00000b7 
syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=5667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.268000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561663133336139373332353838393866626232363966326662373933 Jan 16 18:03:48.268000 audit: BPF prog-id=267 op=LOAD Jan 16 18:03:48.268000 audit: BPF prog-id=265 op=UNLOAD Jan 16 18:03:48.268000 audit[5667]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3023 pid=5667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.268000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561663133336139373332353838393866626232363966326662373933 Jan 16 18:03:48.269000 audit: BPF prog-id=268 op=LOAD Jan 16 18:03:48.269000 audit[5668]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2631 pid=5668 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.269000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239663835353334363237333561393438653936343737363837646339 Jan 16 18:03:48.269000 
audit: BPF prog-id=268 op=UNLOAD Jan 16 18:03:48.269000 audit[5668]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2631 pid=5668 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.269000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239663835353334363237333561393438653936343737363837646339 Jan 16 18:03:48.269000 audit: BPF prog-id=269 op=LOAD Jan 16 18:03:48.269000 audit[5668]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2631 pid=5668 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.269000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239663835353334363237333561393438653936343737363837646339 Jan 16 18:03:48.269000 audit: BPF prog-id=270 op=LOAD Jan 16 18:03:48.269000 audit[5668]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2631 pid=5668 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.269000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239663835353334363237333561393438653936343737363837646339 Jan 16 18:03:48.269000 audit: BPF prog-id=270 op=UNLOAD Jan 16 18:03:48.269000 audit[5668]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2631 pid=5668 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.269000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239663835353334363237333561393438653936343737363837646339 Jan 16 18:03:48.269000 audit: BPF prog-id=269 op=UNLOAD Jan 16 18:03:48.269000 audit[5668]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2631 pid=5668 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.269000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239663835353334363237333561393438653936343737363837646339 Jan 16 18:03:48.269000 audit: BPF prog-id=271 op=LOAD Jan 16 18:03:48.269000 audit[5667]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3023 pid=5667 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
16 18:03:48.269000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561663133336139373332353838393866626232363966326662373933 Jan 16 18:03:48.269000 audit: BPF prog-id=272 op=LOAD Jan 16 18:03:48.269000 audit[5668]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2631 pid=5668 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:03:48.269000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239663835353334363237333561393438653936343737363837646339 Jan 16 18:03:48.294219 containerd[1658]: time="2026-01-16T18:03:48.293736381Z" level=info msg="StartContainer for \"c8ba2a8a84f78f9894479b1af1ff818b6560010f18618555ae19c84b947e6334\" returns successfully" Jan 16 18:03:48.303736 containerd[1658]: time="2026-01-16T18:03:48.303694171Z" level=info msg="StartContainer for \"5af133a973258898fbb269f2fb793401339fbb27facf0399a6d7fb6d0c7f88f9\" returns successfully" Jan 16 18:03:48.309484 containerd[1658]: time="2026-01-16T18:03:48.309221868Z" level=info msg="StartContainer for \"b9f8553462735a948e96477687dc9febb6aaf68a05e7e36301b27ddf463b15a9\" returns successfully" Jan 16 18:03:48.601876 containerd[1658]: time="2026-01-16T18:03:48.601759234Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 18:03:48.946754 containerd[1658]: time="2026-01-16T18:03:48.946562759Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:03:48.948867 containerd[1658]: time="2026-01-16T18:03:48.948819806Z" level=error 
msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 18:03:48.949049 containerd[1658]: time="2026-01-16T18:03:48.948918166Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 18:03:48.949092 kubelet[2921]: E0116 18:03:48.948985 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:03:48.949092 kubelet[2921]: E0116 18:03:48.949021 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:03:48.949379 kubelet[2921]: E0116 18:03:48.949130 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzkc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-5cdd49cc55-ql8tz_calico-apiserver(3860e0e2-bc4a-40cc-b58d-f0b04ee81f50): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 18:03:48.950558 kubelet[2921]: E0116 18:03:48.950521 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ql8tz" podUID="3860e0e2-bc4a-40cc-b58d-f0b04ee81f50" Jan 16 18:03:51.602165 kubelet[2921]: E0116 18:03:51.602101 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-j5dxx" podUID="64da405a-12aa-44ec-8b4d-a44866f591ec" Jan 16 18:03:54.428462 update_engine[1645]: I20260116 18:03:54.428315 1645 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 16 18:03:54.429021 update_engine[1645]: I20260116 18:03:54.428494 1645 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 16 18:03:54.429021 update_engine[1645]: I20260116 18:03:54.429010 1645 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 16 18:03:54.438623 update_engine[1645]: E20260116 18:03:54.438576 1645 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 16 18:03:54.438684 update_engine[1645]: I20260116 18:03:54.438649 1645 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jan 16 18:03:55.602199 kubelet[2921]: E0116 18:03:55.602134 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5cdd49cc55-ck2gc" podUID="2d37882f-c058-4d23-87c0-40e1fbaf0de7" Jan 16 18:03:56.531351 kubelet[2921]: I0116 18:03:56.531308 2921 status_manager.go:890] "Failed to get status for pod" podUID="19ae2064-76f2-47ed-9968-9de33cfc7702" pod="calico-system/whisker-55c9fc4598-2nzbk" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.7.62:60540->10.0.7.107:2379: read: connection timed out" Jan 16 18:03:56.602601 containerd[1658]: time="2026-01-16T18:03:56.602556774Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 16 18:03:56.940672 containerd[1658]: time="2026-01-16T18:03:56.940565958Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:03:56.943490 containerd[1658]: time="2026-01-16T18:03:56.943441047Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 16 18:03:56.943562 containerd[1658]: 
time="2026-01-16T18:03:56.943497687Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 16 18:03:56.943729 kubelet[2921]: E0116 18:03:56.943638 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 18:03:56.943729 kubelet[2921]: E0116 18:03:56.943695 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 18:03:56.944155 kubelet[2921]: E0116 18:03:56.943862 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84qv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-7b6469dfdd-hzhpv_calico-system(cbb94aae-850a-4651-be6c-0c622d131a34): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 16 18:03:56.945110 kubelet[2921]: E0116 18:03:56.945062 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7b6469dfdd-hzhpv" podUID="cbb94aae-850a-4651-be6c-0c622d131a34" Jan 16 18:03:57.602455 containerd[1658]: time="2026-01-16T18:03:57.602393524Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 16 18:03:57.893987 kubelet[2921]: E0116 18:03:57.893856 2921 controller.go:195] "Failed to update lease" err="Put 
\"https://10.0.7.62:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-7f6b5ebc40?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 16 18:03:57.971817 containerd[1658]: time="2026-01-16T18:03:57.971750523Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:03:57.974062 containerd[1658]: time="2026-01-16T18:03:57.974024090Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 16 18:03:57.974138 containerd[1658]: time="2026-01-16T18:03:57.974106050Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 16 18:03:57.974272 kubelet[2921]: E0116 18:03:57.974234 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 18:03:57.974489 kubelet[2921]: E0116 18:03:57.974286 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 18:03:57.974489 kubelet[2921]: E0116 18:03:57.974393 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-csxnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rb578_calico-system(08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 16 18:03:57.976906 containerd[1658]: time="2026-01-16T18:03:57.976880698Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 16 18:03:58.172505 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec Jan 16 18:03:58.315569 containerd[1658]: time="2026-01-16T18:03:58.315497924Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:03:58.322112 containerd[1658]: time="2026-01-16T18:03:58.321896744Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 16 18:03:58.322112 containerd[1658]: time="2026-01-16T18:03:58.322019544Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 16 18:03:58.322378 kubelet[2921]: E0116 18:03:58.322328 2921 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 18:03:58.322435 kubelet[2921]: E0116 18:03:58.322386 2921 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 18:03:58.322580 kubelet[2921]: E0116 18:03:58.322543 2921 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-csxnn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-rb578_calico-system(08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 16 18:03:58.323875 kubelet[2921]: E0116 18:03:58.323734 2921 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-rb578" podUID="08cc5ff3-c204-4b6d-8d83-d5c9e19e6ce3"