Jan 14 01:40:07.448512 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Jan 14 01:40:07.448535 kernel: Linux version 6.12.65-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Tue Jan 13 21:43:11 -00 2026
Jan 14 01:40:07.448545 kernel: KASLR enabled
Jan 14 01:40:07.448551 kernel: efi: EFI v2.7 by EDK II
Jan 14 01:40:07.448557 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438357218
Jan 14 01:40:07.448563 kernel: random: crng init done
Jan 14 01:40:07.448570 kernel: secureboot: Secure boot disabled
Jan 14 01:40:07.448576 kernel: ACPI: Early table checksum verification disabled
Jan 14 01:40:07.448582 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Jan 14 01:40:07.448589 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Jan 14 01:40:07.448596 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:40:07.448602 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:40:07.448608 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:40:07.448614 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:40:07.448623 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:40:07.448629 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:40:07.448636 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:40:07.448642 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:40:07.448649 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:40:07.448655 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 14 01:40:07.448662 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Jan 14 01:40:07.448668 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Jan 14 01:40:07.448675 kernel: ACPI: Use ACPI SPCR as default console: Yes
Jan 14 01:40:07.448682 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Jan 14 01:40:07.448689 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Jan 14 01:40:07.448695 kernel: Zone ranges:
Jan 14 01:40:07.448701 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Jan 14 01:40:07.448708 kernel: DMA32 empty
Jan 14 01:40:07.448714 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Jan 14 01:40:07.448742 kernel: Device empty
Jan 14 01:40:07.448749 kernel: Movable zone start for each node
Jan 14 01:40:07.448755 kernel: Early memory node ranges
Jan 14 01:40:07.448762 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Jan 14 01:40:07.448768 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Jan 14 01:40:07.448775 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Jan 14 01:40:07.448783 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Jan 14 01:40:07.448789 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Jan 14 01:40:07.448796 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Jan 14 01:40:07.448803 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Jan 14 01:40:07.448809 kernel: psci: probing for conduit method from ACPI.
Jan 14 01:40:07.448818 kernel: psci: PSCIv1.3 detected in firmware.
Jan 14 01:40:07.448827 kernel: psci: Using standard PSCI v0.2 function IDs
Jan 14 01:40:07.448833 kernel: psci: Trusted OS migration not required
Jan 14 01:40:07.448840 kernel: psci: SMC Calling Convention v1.1
Jan 14 01:40:07.448847 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Jan 14 01:40:07.448854 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Jan 14 01:40:07.448861 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Jan 14 01:40:07.448867 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Jan 14 01:40:07.448874 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Jan 14 01:40:07.448887 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Jan 14 01:40:07.448895 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Jan 14 01:40:07.448902 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Jan 14 01:40:07.448908 kernel: Detected PIPT I-cache on CPU0
Jan 14 01:40:07.448915 kernel: CPU features: detected: GIC system register CPU interface
Jan 14 01:40:07.448922 kernel: CPU features: detected: Spectre-v4
Jan 14 01:40:07.448929 kernel: CPU features: detected: Spectre-BHB
Jan 14 01:40:07.448936 kernel: CPU features: kernel page table isolation forced ON by KASLR
Jan 14 01:40:07.448943 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Jan 14 01:40:07.448950 kernel: CPU features: detected: ARM erratum 1418040
Jan 14 01:40:07.448957 kernel: CPU features: detected: SSBS not fully self-synchronizing
Jan 14 01:40:07.448965 kernel: alternatives: applying boot alternatives
Jan 14 01:40:07.448974 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=a2e92265a189403c21ae2a2ae9e6d4fed0782e0e430fbcb369a7bb0db156274f
Jan 14 01:40:07.448981 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Jan 14 01:40:07.448988 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 14 01:40:07.448995 kernel: Fallback order for Node 0: 0
Jan 14 01:40:07.449002 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Jan 14 01:40:07.449008 kernel: Policy zone: Normal
Jan 14 01:40:07.449015 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 14 01:40:07.449022 kernel: software IO TLB: area num 4.
Jan 14 01:40:07.449029 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Jan 14 01:40:07.449037 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Jan 14 01:40:07.449044 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 14 01:40:07.449051 kernel: rcu: RCU event tracing is enabled.
Jan 14 01:40:07.449059 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Jan 14 01:40:07.449066 kernel: Trampoline variant of Tasks RCU enabled.
Jan 14 01:40:07.449072 kernel: Tracing variant of Tasks RCU enabled.
Jan 14 01:40:07.449079 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 14 01:40:07.449086 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Jan 14 01:40:07.449094 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 14 01:40:07.449100 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jan 14 01:40:07.449107 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Jan 14 01:40:07.449116 kernel: GICv3: 256 SPIs implemented
Jan 14 01:40:07.449126 kernel: GICv3: 0 Extended SPIs implemented
Jan 14 01:40:07.449133 kernel: Root IRQ handler: gic_handle_irq
Jan 14 01:40:07.449139 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Jan 14 01:40:07.449146 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Jan 14 01:40:07.449153 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Jan 14 01:40:07.449160 kernel: ITS [mem 0x08080000-0x0809ffff]
Jan 14 01:40:07.449167 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Jan 14 01:40:07.449174 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Jan 14 01:40:07.449181 kernel: GICv3: using LPI property table @0x0000000100130000
Jan 14 01:40:07.449187 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Jan 14 01:40:07.449194 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 14 01:40:07.449203 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 14 01:40:07.449210 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Jan 14 01:40:07.449217 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Jan 14 01:40:07.449224 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Jan 14 01:40:07.449231 kernel: arm-pv: using stolen time PV
Jan 14 01:40:07.449238 kernel: Console: colour dummy device 80x25
Jan 14 01:40:07.449245 kernel: ACPI: Core revision 20240827
Jan 14 01:40:07.449253 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Jan 14 01:40:07.449262 kernel: pid_max: default: 32768 minimum: 301
Jan 14 01:40:07.449269 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jan 14 01:40:07.449276 kernel: landlock: Up and running.
Jan 14 01:40:07.449283 kernel: SELinux: Initializing.
Jan 14 01:40:07.449290 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 14 01:40:07.449298 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 14 01:40:07.449305 kernel: rcu: Hierarchical SRCU implementation.
Jan 14 01:40:07.449312 kernel: rcu: Max phase no-delay instances is 400.
Jan 14 01:40:07.449321 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jan 14 01:40:07.449328 kernel: Remapping and enabling EFI services.
Jan 14 01:40:07.449335 kernel: smp: Bringing up secondary CPUs ...
Jan 14 01:40:07.449342 kernel: Detected PIPT I-cache on CPU1
Jan 14 01:40:07.449350 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Jan 14 01:40:07.449357 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Jan 14 01:40:07.449364 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 14 01:40:07.449373 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Jan 14 01:40:07.449380 kernel: Detected PIPT I-cache on CPU2
Jan 14 01:40:07.449392 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Jan 14 01:40:07.449401 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Jan 14 01:40:07.449408 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 14 01:40:07.449416 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Jan 14 01:40:07.449423 kernel: Detected PIPT I-cache on CPU3
Jan 14 01:40:07.449430 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Jan 14 01:40:07.449439 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Jan 14 01:40:07.449447 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jan 14 01:40:07.449454 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Jan 14 01:40:07.449461 kernel: smp: Brought up 1 node, 4 CPUs
Jan 14 01:40:07.449469 kernel: SMP: Total of 4 processors activated.
Jan 14 01:40:07.449476 kernel: CPU: All CPU(s) started at EL1
Jan 14 01:40:07.449485 kernel: CPU features: detected: 32-bit EL0 Support
Jan 14 01:40:07.449492 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Jan 14 01:40:07.449500 kernel: CPU features: detected: Common not Private translations
Jan 14 01:40:07.449507 kernel: CPU features: detected: CRC32 instructions
Jan 14 01:40:07.449514 kernel: CPU features: detected: Enhanced Virtualization Traps
Jan 14 01:40:07.449522 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Jan 14 01:40:07.449529 kernel: CPU features: detected: LSE atomic instructions
Jan 14 01:40:07.449538 kernel: CPU features: detected: Privileged Access Never
Jan 14 01:40:07.449546 kernel: CPU features: detected: RAS Extension Support
Jan 14 01:40:07.449553 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Jan 14 01:40:07.449561 kernel: alternatives: applying system-wide alternatives
Jan 14 01:40:07.449568 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Jan 14 01:40:07.449576 kernel: Memory: 16324368K/16777216K available (11200K kernel code, 2458K rwdata, 9092K rodata, 12480K init, 1038K bss, 430064K reserved, 16384K cma-reserved)
Jan 14 01:40:07.449584 kernel: devtmpfs: initialized
Jan 14 01:40:07.449592 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 14 01:40:07.449601 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Jan 14 01:40:07.449609 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Jan 14 01:40:07.449616 kernel: 0 pages in range for non-PLT usage
Jan 14 01:40:07.449624 kernel: 515152 pages in range for PLT usage
Jan 14 01:40:07.449631 kernel: pinctrl core: initialized pinctrl subsystem
Jan 14 01:40:07.449639 kernel: SMBIOS 3.0.0 present.
Jan 14 01:40:07.449646 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Jan 14 01:40:07.449655 kernel: DMI: Memory slots populated: 1/1
Jan 14 01:40:07.449662 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 14 01:40:07.449670 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Jan 14 01:40:07.449677 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 14 01:40:07.449685 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 14 01:40:07.449692 kernel: audit: initializing netlink subsys (disabled)
Jan 14 01:40:07.449700 kernel: audit: type=2000 audit(0.038:1): state=initialized audit_enabled=0 res=1
Jan 14 01:40:07.449709 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 14 01:40:07.449723 kernel: cpuidle: using governor menu
Jan 14 01:40:07.449732 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Jan 14 01:40:07.449739 kernel: ASID allocator initialised with 32768 entries
Jan 14 01:40:07.449747 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 14 01:40:07.449754 kernel: Serial: AMBA PL011 UART driver
Jan 14 01:40:07.449762 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 14 01:40:07.449771 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Jan 14 01:40:07.449779 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Jan 14 01:40:07.449786 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Jan 14 01:40:07.449794 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 14 01:40:07.449801 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Jan 14 01:40:07.449809 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Jan 14 01:40:07.449817 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Jan 14 01:40:07.449825 kernel: ACPI: Added _OSI(Module Device)
Jan 14 01:40:07.449833 kernel: ACPI: Added _OSI(Processor Device)
Jan 14 01:40:07.449840 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 14 01:40:07.449848 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 14 01:40:07.449855 kernel: ACPI: Interpreter enabled
Jan 14 01:40:07.449863 kernel: ACPI: Using GIC for interrupt routing
Jan 14 01:40:07.449870 kernel: ACPI: MCFG table detected, 1 entries
Jan 14 01:40:07.449878 kernel: ACPI: CPU0 has been hot-added
Jan 14 01:40:07.449886 kernel: ACPI: CPU1 has been hot-added
Jan 14 01:40:07.449894 kernel: ACPI: CPU2 has been hot-added
Jan 14 01:40:07.449901 kernel: ACPI: CPU3 has been hot-added
Jan 14 01:40:07.449909 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Jan 14 01:40:07.449916 kernel: printk: legacy console [ttyAMA0] enabled
Jan 14 01:40:07.449924 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 14 01:40:07.450082 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 14 01:40:07.450173 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jan 14 01:40:07.450277 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jan 14 01:40:07.450362 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Jan 14 01:40:07.450442 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Jan 14 01:40:07.450452 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Jan 14 01:40:07.450459 kernel: PCI host bridge to bus 0000:00
Jan 14 01:40:07.450547 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Jan 14 01:40:07.450623 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Jan 14 01:40:07.450698 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Jan 14 01:40:07.450805 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 14 01:40:07.450908 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Jan 14 01:40:07.451021 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.451121 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Jan 14 01:40:07.451207 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jan 14 01:40:07.451287 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Jan 14 01:40:07.451367 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Jan 14 01:40:07.451736 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.451842 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Jan 14 01:40:07.451925 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Jan 14 01:40:07.452005 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Jan 14 01:40:07.452100 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.452191 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Jan 14 01:40:07.452276 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Jan 14 01:40:07.452354 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Jan 14 01:40:07.452438 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Jan 14 01:40:07.452526 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.452606 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Jan 14 01:40:07.452685 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Jan 14 01:40:07.452786 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Jan 14 01:40:07.452875 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.452957 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Jan 14 01:40:07.453036 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Jan 14 01:40:07.453126 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Jan 14 01:40:07.453208 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Jan 14 01:40:07.453308 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.453791 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Jan 14 01:40:07.453907 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Jan 14 01:40:07.454001 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Jan 14 01:40:07.454080 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Jan 14 01:40:07.454168 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.454272 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Jan 14 01:40:07.454355 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Jan 14 01:40:07.454442 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.454523 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Jan 14 01:40:07.454603 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Jan 14 01:40:07.454704 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.454813 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Jan 14 01:40:07.454895 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Jan 14 01:40:07.454984 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.455075 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Jan 14 01:40:07.455160 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Jan 14 01:40:07.455263 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.455347 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Jan 14 01:40:07.455425 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Jan 14 01:40:07.455528 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.455609 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Jan 14 01:40:07.455691 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Jan 14 01:40:07.455808 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.455895 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Jan 14 01:40:07.455975 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Jan 14 01:40:07.456061 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.456143 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Jan 14 01:40:07.456235 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Jan 14 01:40:07.456323 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.456403 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Jan 14 01:40:07.456499 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Jan 14 01:40:07.456592 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.456677 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Jan 14 01:40:07.456773 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Jan 14 01:40:07.456866 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.456947 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Jan 14 01:40:07.457026 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Jan 14 01:40:07.457112 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.457195 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Jan 14 01:40:07.457273 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Jan 14 01:40:07.457351 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Jan 14 01:40:07.457442 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Jan 14 01:40:07.457532 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.457616 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Jan 14 01:40:07.457701 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Jan 14 01:40:07.457850 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Jan 14 01:40:07.457944 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Jan 14 01:40:07.458041 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.458122 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Jan 14 01:40:07.458201 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Jan 14 01:40:07.458302 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Jan 14 01:40:07.458382 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Jan 14 01:40:07.458473 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.458555 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Jan 14 01:40:07.458633 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Jan 14 01:40:07.458732 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Jan 14 01:40:07.458824 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Jan 14 01:40:07.458912 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.458993 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Jan 14 01:40:07.459073 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Jan 14 01:40:07.459164 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Jan 14 01:40:07.459247 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Jan 14 01:40:07.459338 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.459419 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Jan 14 01:40:07.459499 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Jan 14 01:40:07.459579 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Jan 14 01:40:07.459658 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Jan 14 01:40:07.459770 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.459857 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Jan 14 01:40:07.459937 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Jan 14 01:40:07.460018 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Jan 14 01:40:07.460107 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Jan 14 01:40:07.460194 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.460274 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Jan 14 01:40:07.460356 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Jan 14 01:40:07.460435 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Jan 14 01:40:07.460513 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Jan 14 01:40:07.460598 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.460678 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Jan 14 01:40:07.460771 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Jan 14 01:40:07.460852 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Jan 14 01:40:07.460931 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Jan 14 01:40:07.461016 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.461096 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Jan 14 01:40:07.461173 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Jan 14 01:40:07.461254 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Jan 14 01:40:07.461332 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Jan 14 01:40:07.461430 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.461512 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Jan 14 01:40:07.461595 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Jan 14 01:40:07.461674 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Jan 14 01:40:07.461761 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Jan 14 01:40:07.461857 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.461938 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Jan 14 01:40:07.462016 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Jan 14 01:40:07.462097 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Jan 14 01:40:07.462175 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Jan 14 01:40:07.462278 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.462369 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Jan 14 01:40:07.462449 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Jan 14 01:40:07.462529 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Jan 14 01:40:07.462611 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Jan 14 01:40:07.462699 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.462817 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Jan 14 01:40:07.462908 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Jan 14 01:40:07.462989 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Jan 14 01:40:07.463068 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Jan 14 01:40:07.463161 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.463242 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Jan 14 01:40:07.463322 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Jan 14 01:40:07.463401 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Jan 14 01:40:07.463480 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Jan 14 01:40:07.463566 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jan 14 01:40:07.463655 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Jan 14 01:40:07.463753 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Jan 14 01:40:07.463837 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Jan 14 01:40:07.463917 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Jan 14 01:40:07.464006 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jan 14 01:40:07.464097 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Jan 14 01:40:07.464187 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Jan 14 01:40:07.464269 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jan 14 01:40:07.464361 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Jan 14 01:40:07.464444 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Jan 14 01:40:07.464542 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Jan 14 01:40:07.464641 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Jan 14 01:40:07.464758 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Jan 14 01:40:07.464859 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jan 14 01:40:07.464956 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Jan 14 01:40:07.465067 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jan 14 01:40:07.465160 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Jan 14 01:40:07.465257 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Jan 14 01:40:07.465350 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Jan 14 01:40:07.465433 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Jan 14 01:40:07.465530 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Jan 14 01:40:07.465620 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Jan 14 01:40:07.465707 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Jan 14 01:40:07.465815 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Jan 14 01:40:07.465902 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Jan 14 01:40:07.465982 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Jan 14 01:40:07.466066 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Jan 14 01:40:07.466153 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jan 14 01:40:07.466256 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Jan 14 01:40:07.466343 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Jan 14 01:40:07.466435 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jan 14 01:40:07.466521 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Jan 14 01:40:07.466602 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Jan 14 01:40:07.466691 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jan 14 01:40:07.466797 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Jan 14 01:40:07.466895 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Jan 14 01:40:07.466981 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 14 01:40:07.467064 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Jan 14 01:40:07.467150 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Jan 14 01:40:07.467235 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 14 01:40:07.467314 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Jan 14 01:40:07.467393 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Jan 14 01:40:07.467486 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 14 01:40:07.467579 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Jan 14 01:40:07.467660 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Jan 14 01:40:07.467767 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 14 01:40:07.467852 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Jan 14 01:40:07.467932 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Jan 14 01:40:07.468020 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 14 01:40:07.468111 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Jan 14 01:40:07.468192 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Jan 14 01:40:07.468276 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Jan 14 01:40:07.468356 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Jan 14 01:40:07.468435 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000
Jan 14 01:40:07.468528 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Jan 14 01:40:07.468608 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000
Jan 14 01:40:07.468696 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000
Jan 14 01:40:07.468804 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Jan 14 01:40:07.468885 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000
Jan 14 01:40:07.468965 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000
Jan 14
01:40:07.469062 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 14 01:40:07.469144 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Jan 14 01:40:07.469225 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Jan 14 01:40:07.469310 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 14 01:40:07.469396 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Jan 14 01:40:07.469483 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Jan 14 01:40:07.469567 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 14 01:40:07.469649 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Jan 14 01:40:07.469742 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Jan 14 01:40:07.469830 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 14 01:40:07.469915 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Jan 14 01:40:07.469998 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Jan 14 01:40:07.470084 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 14 01:40:07.470166 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Jan 14 01:40:07.470263 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 Jan 14 01:40:07.470351 kernel: pci 0000:00:03.2: 
bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 14 01:40:07.470436 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Jan 14 01:40:07.470516 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Jan 14 01:40:07.470601 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 14 01:40:07.470681 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Jan 14 01:40:07.470773 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Jan 14 01:40:07.470858 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 14 01:40:07.470951 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Jan 14 01:40:07.471037 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Jan 14 01:40:07.471123 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 14 01:40:07.471203 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Jan 14 01:40:07.471283 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Jan 14 01:40:07.471372 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 14 01:40:07.471452 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Jan 14 01:40:07.471532 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Jan 14 01:40:07.471615 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] 
add_size 1000 Jan 14 01:40:07.471695 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Jan 14 01:40:07.471783 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Jan 14 01:40:07.471875 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 14 01:40:07.471960 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Jan 14 01:40:07.472041 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Jan 14 01:40:07.472126 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 14 01:40:07.472207 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Jan 14 01:40:07.472286 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Jan 14 01:40:07.472373 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 14 01:40:07.472453 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Jan 14 01:40:07.472531 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Jan 14 01:40:07.472614 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 14 01:40:07.472694 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Jan 14 01:40:07.472787 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Jan 14 01:40:07.472872 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 14 01:40:07.472952 kernel: 
pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Jan 14 01:40:07.473033 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Jan 14 01:40:07.473116 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 14 01:40:07.473196 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Jan 14 01:40:07.473277 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Jan 14 01:40:07.473362 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 14 01:40:07.473443 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Jan 14 01:40:07.473532 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Jan 14 01:40:07.473617 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 14 01:40:07.473699 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Jan 14 01:40:07.473802 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Jan 14 01:40:07.473887 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 14 01:40:07.473968 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Jan 14 01:40:07.474047 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Jan 14 01:40:07.474129 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Jan 14 01:40:07.474212 kernel: pci 0000:00:01.0: bridge window [mem 
0x8000000000-0x80001fffff 64bit pref]: assigned Jan 14 01:40:07.474308 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Jan 14 01:40:07.474389 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Jan 14 01:40:07.474470 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Jan 14 01:40:07.474561 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Jan 14 01:40:07.474644 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Jan 14 01:40:07.474744 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Jan 14 01:40:07.474840 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Jan 14 01:40:07.474921 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Jan 14 01:40:07.475002 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Jan 14 01:40:07.475082 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Jan 14 01:40:07.475163 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Jan 14 01:40:07.475243 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Jan 14 01:40:07.475326 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Jan 14 01:40:07.475405 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Jan 14 01:40:07.475485 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Jan 14 01:40:07.475567 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Jan 14 01:40:07.475648 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Jan 14 01:40:07.475739 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: 
assigned Jan 14 01:40:07.475824 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Jan 14 01:40:07.475904 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Jan 14 01:40:07.475985 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Jan 14 01:40:07.476065 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Jan 14 01:40:07.476146 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Jan 14 01:40:07.476225 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Jan 14 01:40:07.476305 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Jan 14 01:40:07.476387 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Jan 14 01:40:07.476467 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Jan 14 01:40:07.476546 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Jan 14 01:40:07.476627 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Jan 14 01:40:07.476724 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Jan 14 01:40:07.476809 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Jan 14 01:40:07.476896 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Jan 14 01:40:07.476978 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Jan 14 01:40:07.477058 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Jan 14 01:40:07.477138 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Jan 14 01:40:07.477219 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Jan 14 01:40:07.477299 kernel: pci 
0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Jan 14 01:40:07.477380 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Jan 14 01:40:07.477461 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Jan 14 01:40:07.477540 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Jan 14 01:40:07.477621 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Jan 14 01:40:07.477700 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Jan 14 01:40:07.477790 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Jan 14 01:40:07.477870 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Jan 14 01:40:07.477954 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Jan 14 01:40:07.478033 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Jan 14 01:40:07.478114 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Jan 14 01:40:07.478193 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Jan 14 01:40:07.478298 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Jan 14 01:40:07.478379 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Jan 14 01:40:07.478462 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Jan 14 01:40:07.478543 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Jan 14 01:40:07.478625 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Jan 14 01:40:07.478706 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Jan 14 01:40:07.478798 kernel: pci 0000:00:04.4: bridge window [mem 
0x13800000-0x139fffff]: assigned Jan 14 01:40:07.478880 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Jan 14 01:40:07.478964 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Jan 14 01:40:07.479043 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Jan 14 01:40:07.479124 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Jan 14 01:40:07.479204 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Jan 14 01:40:07.479285 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Jan 14 01:40:07.479364 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Jan 14 01:40:07.479443 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Jan 14 01:40:07.479524 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Jan 14 01:40:07.479605 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Jan 14 01:40:07.479684 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Jan 14 01:40:07.479781 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Jan 14 01:40:07.479872 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Jan 14 01:40:07.479956 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Jan 14 01:40:07.480046 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Jan 14 01:40:07.480131 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Jan 14 01:40:07.480212 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Jan 14 01:40:07.480292 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Jan 14 01:40:07.480373 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Jan 14 01:40:07.480460 kernel: pci 0000:00:01.5: BAR 0 
[mem 0x14205000-0x14205fff]: assigned Jan 14 01:40:07.480541 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Jan 14 01:40:07.480622 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned Jan 14 01:40:07.480701 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Jan 14 01:40:07.480799 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Jan 14 01:40:07.480880 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Jan 14 01:40:07.480961 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Jan 14 01:40:07.481041 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Jan 14 01:40:07.481124 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Jan 14 01:40:07.481203 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Jan 14 01:40:07.481284 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Jan 14 01:40:07.481363 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Jan 14 01:40:07.481443 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Jan 14 01:40:07.481522 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Jan 14 01:40:07.481604 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Jan 14 01:40:07.481683 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Jan 14 01:40:07.481778 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Jan 14 01:40:07.481859 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Jan 14 01:40:07.481941 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Jan 14 01:40:07.482023 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Jan 14 01:40:07.482104 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Jan 14 01:40:07.482185 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 14 
01:40:07.482289 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.482370 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Jan 14 01:40:07.482450 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.482532 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.482612 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Jan 14 01:40:07.482691 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.482789 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.482880 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Jan 14 01:40:07.482962 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.483045 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.483126 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Jan 14 01:40:07.483210 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.483290 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.483372 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Jan 14 01:40:07.483452 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.483533 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.483619 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Jan 14 01:40:07.483699 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.483789 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.483874 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Jan 14 01:40:07.483955 kernel: pci 
0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.484034 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.484118 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Jan 14 01:40:07.484199 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.484278 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.484359 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Jan 14 01:40:07.484438 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.484518 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.484600 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Jan 14 01:40:07.484680 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.484777 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.484867 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Jan 14 01:40:07.484954 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.485035 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.485122 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Jan 14 01:40:07.485206 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.485285 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.485365 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Jan 14 01:40:07.485452 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.485533 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.485614 kernel: pci 0000:00:04.5: BAR 0 [mem 
0x1421d000-0x1421dfff]: assigned Jan 14 01:40:07.485703 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.485794 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.485876 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Jan 14 01:40:07.485956 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.486036 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.486119 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Jan 14 01:40:07.486206 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.486307 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.486389 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Jan 14 01:40:07.486479 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.486559 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.486645 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Jan 14 01:40:07.486739 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Jan 14 01:40:07.486827 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Jan 14 01:40:07.486909 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Jan 14 01:40:07.487000 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Jan 14 01:40:07.487088 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Jan 14 01:40:07.487170 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Jan 14 01:40:07.487252 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Jan 14 01:40:07.487333 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Jan 14 01:40:07.487419 kernel: pci 0000:00:03.7: 
bridge window [io 0xa000-0xafff]: assigned Jan 14 01:40:07.487499 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Jan 14 01:40:07.487591 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Jan 14 01:40:07.487679 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Jan 14 01:40:07.487775 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Jan 14 01:40:07.487866 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Jan 14 01:40:07.487948 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.488032 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.488119 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.488200 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.488280 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.488360 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.488441 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.488521 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.488606 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.488685 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.488786 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.488870 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.488950 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.489037 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.489121 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: 
can't assign; no space Jan 14 01:40:07.489204 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.489284 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.489363 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.489445 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.489524 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.489606 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.489690 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.489791 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.489873 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.489962 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.490051 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.490142 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.490241 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.490328 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.490409 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.490492 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.490572 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.490656 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.490753 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.490837 kernel: pci 0000:00:01.0: bridge 
window [io size 0x1000]: can't assign; no space Jan 14 01:40:07.490918 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Jan 14 01:40:07.491009 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Jan 14 01:40:07.491094 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Jan 14 01:40:07.491175 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Jan 14 01:40:07.491255 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 14 01:40:07.491336 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Jan 14 01:40:07.491418 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Jan 14 01:40:07.491505 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Jan 14 01:40:07.491585 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Jan 14 01:40:07.491668 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Jan 14 01:40:07.491758 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Jan 14 01:40:07.491848 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Jan 14 01:40:07.491943 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Jan 14 01:40:07.492028 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Jan 14 01:40:07.492107 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Jan 14 01:40:07.492188 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Jan 14 01:40:07.492276 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Jan 14 01:40:07.492361 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Jan 14 01:40:07.492444 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Jan 14 01:40:07.492523 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Jan 14 01:40:07.492615 kernel: pci 0000:05:00.0: BAR 4 [mem 
0x8000800000-0x8000803fff 64bit pref]: assigned Jan 14 01:40:07.492697 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Jan 14 01:40:07.492784 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Jan 14 01:40:07.492865 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Jan 14 01:40:07.492945 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Jan 14 01:40:07.493038 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Jan 14 01:40:07.493121 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Jan 14 01:40:07.493203 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 14 01:40:07.493282 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Jan 14 01:40:07.493367 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 14 01:40:07.493448 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 14 01:40:07.493528 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Jan 14 01:40:07.493610 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 14 01:40:07.493708 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 14 01:40:07.493839 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Jan 14 01:40:07.493921 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 14 01:40:07.494002 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 14 01:40:07.494089 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Jan 14 01:40:07.494177 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jan 14 01:40:07.494277 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 14 01:40:07.494360 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Jan 14 01:40:07.494440 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Jan 14 01:40:07.494522 kernel: pci 
0000:00:02.2: PCI bridge to [bus 0b] Jan 14 01:40:07.494603 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Jan 14 01:40:07.494693 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Jan 14 01:40:07.494802 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 14 01:40:07.494885 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Jan 14 01:40:07.494965 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Jan 14 01:40:07.495049 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 14 01:40:07.495132 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Jan 14 01:40:07.495219 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Jan 14 01:40:07.495301 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 14 01:40:07.495388 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Jan 14 01:40:07.495468 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 14 01:40:07.495556 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 14 01:40:07.495640 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Jan 14 01:40:07.495729 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 14 01:40:07.495819 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 14 01:40:07.495901 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Jan 14 01:40:07.495981 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 14 01:40:07.496065 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 14 01:40:07.496145 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Jan 14 01:40:07.496224 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Jan 14 01:40:07.496307 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 14 01:40:07.496400 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Jan 14 01:40:07.496480 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Jan 14 01:40:07.496566 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 14 01:40:07.496666 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Jan 14 01:40:07.496766 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Jan 14 01:40:07.496857 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Jan 14 01:40:07.496940 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 14 01:40:07.497020 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Jan 14 01:40:07.497123 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Jan 14 01:40:07.497208 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Jan 14 01:40:07.497291 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 14 01:40:07.497372 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Jan 14 01:40:07.497472 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Jan 14 01:40:07.497555 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Jan 14 01:40:07.497636 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 14 01:40:07.497739 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Jan 14 01:40:07.497824 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Jan 14 01:40:07.497906 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 14 01:40:07.497988 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 14 01:40:07.498067 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Jan 14 01:40:07.498156 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Jan 14 01:40:07.498259 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 14 01:40:07.498349 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 14 01:40:07.498432 kernel: pci 
0000:00:03.7: bridge window [io 0xa000-0xafff] Jan 14 01:40:07.498515 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Jan 14 01:40:07.498595 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 14 01:40:07.498676 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 14 01:40:07.498770 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Jan 14 01:40:07.498856 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Jan 14 01:40:07.498936 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Jan 14 01:40:07.499019 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 14 01:40:07.499099 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Jan 14 01:40:07.499187 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Jan 14 01:40:07.499275 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Jan 14 01:40:07.499369 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 14 01:40:07.499455 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Jan 14 01:40:07.499536 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Jan 14 01:40:07.499617 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Jan 14 01:40:07.499698 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 14 01:40:07.499799 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Jan 14 01:40:07.499881 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Jan 14 01:40:07.499960 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Jan 14 01:40:07.500044 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 14 01:40:07.500125 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Jan 14 01:40:07.500206 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Jan 14 01:40:07.500285 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] 
Jan 14 01:40:07.500367 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 14 01:40:07.500446 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Jan 14 01:40:07.500527 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Jan 14 01:40:07.500606 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 14 01:40:07.500687 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 14 01:40:07.500812 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Jan 14 01:40:07.500896 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Jan 14 01:40:07.500975 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 14 01:40:07.501058 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 14 01:40:07.501149 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Jan 14 01:40:07.501231 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Jan 14 01:40:07.501310 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 14 01:40:07.501392 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 14 01:40:07.501471 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Jan 14 01:40:07.501551 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Jan 14 01:40:07.501629 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Jan 14 01:40:07.501728 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jan 14 01:40:07.501812 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Jan 14 01:40:07.501896 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jan 14 01:40:07.501983 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jan 14 01:40:07.502059 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jan 14 01:40:07.502144 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jan 14 01:40:07.502230 kernel: pci_bus 
0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jan 14 01:40:07.502319 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jan 14 01:40:07.502394 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jan 14 01:40:07.502475 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jan 14 01:40:07.502553 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jan 14 01:40:07.502633 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jan 14 01:40:07.502708 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jan 14 01:40:07.502806 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jan 14 01:40:07.502885 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 14 01:40:07.502976 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jan 14 01:40:07.503066 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 14 01:40:07.503151 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jan 14 01:40:07.503227 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 14 01:40:07.503308 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jan 14 01:40:07.503383 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jan 14 01:40:07.503467 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Jan 14 01:40:07.503542 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Jan 14 01:40:07.503637 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Jan 14 01:40:07.503714 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Jan 14 01:40:07.503811 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Jan 14 01:40:07.503890 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Jan 14 01:40:07.503971 kernel: pci_bus 
0000:0d: resource 1 [mem 0x11800000-0x119fffff] Jan 14 01:40:07.504046 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Jan 14 01:40:07.504127 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Jan 14 01:40:07.504201 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 14 01:40:07.504300 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Jan 14 01:40:07.504385 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 14 01:40:07.504475 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Jan 14 01:40:07.504551 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 14 01:40:07.504633 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Jan 14 01:40:07.504711 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Jan 14 01:40:07.504821 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Jan 14 01:40:07.504903 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Jan 14 01:40:07.504989 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Jan 14 01:40:07.505064 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Jan 14 01:40:07.505141 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Jan 14 01:40:07.505222 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Jan 14 01:40:07.505297 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Jan 14 01:40:07.505371 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Jan 14 01:40:07.505451 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Jan 14 01:40:07.505528 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Jan 14 01:40:07.505604 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Jan 14 01:40:07.505686 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Jan 14 01:40:07.505788 
kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Jan 14 01:40:07.505866 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 14 01:40:07.505948 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Jan 14 01:40:07.506025 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Jan 14 01:40:07.506099 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 14 01:40:07.506179 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Jan 14 01:40:07.506275 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Jan 14 01:40:07.506359 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 14 01:40:07.506458 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Jan 14 01:40:07.506538 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Jan 14 01:40:07.506612 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Jan 14 01:40:07.506699 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Jan 14 01:40:07.506800 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Jan 14 01:40:07.506886 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Jan 14 01:40:07.506969 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 14 01:40:07.507048 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Jan 14 01:40:07.507122 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Jan 14 01:40:07.507215 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Jan 14 01:40:07.507295 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Jan 14 01:40:07.507375 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Jan 14 01:40:07.507462 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Jan 14 01:40:07.507537 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Jan 14 01:40:07.507611 kernel: pci_bus 0000:1d: resource 2 [mem 
0x8003800000-0x80039fffff 64bit pref] Jan 14 01:40:07.507691 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Jan 14 01:40:07.507793 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Jan 14 01:40:07.507871 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 14 01:40:07.507962 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Jan 14 01:40:07.508040 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Jan 14 01:40:07.508114 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 14 01:40:07.508195 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Jan 14 01:40:07.508270 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Jan 14 01:40:07.508346 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 14 01:40:07.508431 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Jan 14 01:40:07.508506 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Jan 14 01:40:07.508580 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Jan 14 01:40:07.508590 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jan 14 01:40:07.508598 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jan 14 01:40:07.508607 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jan 14 01:40:07.508617 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jan 14 01:40:07.508625 kernel: iommu: Default domain type: Translated Jan 14 01:40:07.508634 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jan 14 01:40:07.508642 kernel: efivars: Registered efivars operations Jan 14 01:40:07.508650 kernel: vgaarb: loaded Jan 14 01:40:07.508658 kernel: clocksource: Switched to clocksource arch_sys_counter Jan 14 01:40:07.508666 kernel: VFS: Disk quotas dquot_6.6.0 Jan 14 01:40:07.508676 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 14 01:40:07.508684 kernel: pnp: PnP ACPI 
init Jan 14 01:40:07.508804 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jan 14 01:40:07.508817 kernel: pnp: PnP ACPI: found 1 devices Jan 14 01:40:07.508825 kernel: NET: Registered PF_INET protocol family Jan 14 01:40:07.508834 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 14 01:40:07.508845 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Jan 14 01:40:07.508853 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 14 01:40:07.508861 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 14 01:40:07.508869 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Jan 14 01:40:07.508877 kernel: TCP: Hash tables configured (established 131072 bind 65536) Jan 14 01:40:07.508885 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 14 01:40:07.508894 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 14 01:40:07.508904 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 14 01:40:07.508994 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jan 14 01:40:07.509011 kernel: PCI: CLS 0 bytes, default 64 Jan 14 01:40:07.509020 kernel: kvm [1]: HYP mode not available Jan 14 01:40:07.509029 kernel: Initialise system trusted keyrings Jan 14 01:40:07.509037 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Jan 14 01:40:07.509045 kernel: Key type asymmetric registered Jan 14 01:40:07.509060 kernel: Asymmetric key parser 'x509' registered Jan 14 01:40:07.509069 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jan 14 01:40:07.509078 kernel: io scheduler mq-deadline registered Jan 14 01:40:07.509086 kernel: io scheduler kyber registered Jan 14 01:40:07.509094 kernel: io scheduler bfq registered Jan 14 01:40:07.509102 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jan 14 
01:40:07.509213 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Jan 14 01:40:07.509297 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Jan 14 01:40:07.509380 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.509469 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Jan 14 01:40:07.509559 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Jan 14 01:40:07.509640 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.509739 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Jan 14 01:40:07.509835 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Jan 14 01:40:07.509927 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.510011 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Jan 14 01:40:07.510091 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Jan 14 01:40:07.510170 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.510273 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Jan 14 01:40:07.510366 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Jan 14 01:40:07.510450 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.510532 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Jan 14 01:40:07.510621 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55 Jan 14 01:40:07.510703 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.510821 
kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Jan 14 01:40:07.510906 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Jan 14 01:40:07.510986 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.511071 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Jan 14 01:40:07.511151 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Jan 14 01:40:07.511230 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.511241 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jan 14 01:40:07.511320 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Jan 14 01:40:07.511399 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Jan 14 01:40:07.511480 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.511562 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Jan 14 01:40:07.511643 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Jan 14 01:40:07.511738 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.511825 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Jan 14 01:40:07.511906 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Jan 14 01:40:07.511988 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.512070 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Jan 14 01:40:07.512154 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Jan 14 01:40:07.512234 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ 
NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.512315 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Jan 14 01:40:07.512395 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Jan 14 01:40:07.512474 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.512557 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Jan 14 01:40:07.512637 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Jan 14 01:40:07.512723 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.512810 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Jan 14 01:40:07.512890 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Jan 14 01:40:07.512969 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.513055 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Jan 14 01:40:07.513136 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Jan 14 01:40:07.513215 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.513226 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 14 01:40:07.513304 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Jan 14 01:40:07.513385 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Jan 14 01:40:07.513468 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.513549 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Jan 14 01:40:07.513628 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Jan 14 01:40:07.513708 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ 
MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.513803 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Jan 14 01:40:07.513894 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Jan 14 01:40:07.513975 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.514067 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Jan 14 01:40:07.514158 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Jan 14 01:40:07.514253 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.514338 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Jan 14 01:40:07.514417 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Jan 14 01:40:07.514497 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.514581 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Jan 14 01:40:07.514662 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Jan 14 01:40:07.514754 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.514838 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Jan 14 01:40:07.514917 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Jan 14 01:40:07.514997 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.515082 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Jan 14 01:40:07.515162 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Jan 14 01:40:07.515240 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.515251 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jan 14 01:40:07.515329 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Jan 14 01:40:07.515409 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Jan 14 01:40:07.515488 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.515573 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Jan 14 01:40:07.515653 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Jan 14 01:40:07.515748 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.515832 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Jan 14 01:40:07.515921 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Jan 14 01:40:07.516003 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.516088 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Jan 14 01:40:07.516168 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Jan 14 01:40:07.516247 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.516329 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Jan 14 01:40:07.516409 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Jan 14 01:40:07.516489 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.516573 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Jan 14 01:40:07.516652 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Jan 14 01:40:07.516743 kernel: pcieport 
0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.516827 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Jan 14 01:40:07.516908 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Jan 14 01:40:07.516987 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.517072 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Jan 14 01:40:07.517152 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Jan 14 01:40:07.517231 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.517313 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Jan 14 01:40:07.517409 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Jan 14 01:40:07.517490 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 01:40:07.517502 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jan 14 01:40:07.517512 kernel: ACPI: button: Power Button [PWRB] Jan 14 01:40:07.517597 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Jan 14 01:40:07.517696 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jan 14 01:40:07.517708 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 14 01:40:07.517728 kernel: thunder_xcv, ver 1.0 Jan 14 01:40:07.517738 kernel: thunder_bgx, ver 1.0 Jan 14 01:40:07.517746 kernel: nicpf, ver 1.0 Jan 14 01:40:07.517757 kernel: nicvf, ver 1.0 Jan 14 01:40:07.517863 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jan 14 01:40:07.517943 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-01-14T01:40:06 UTC (1768354806) Jan 14 01:40:07.517954 kernel: hid: raw HID events driver (C) Jiri 
Kosina Jan 14 01:40:07.517962 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jan 14 01:40:07.517970 kernel: watchdog: NMI not fully supported Jan 14 01:40:07.517980 kernel: watchdog: Hard watchdog permanently disabled Jan 14 01:40:07.517988 kernel: NET: Registered PF_INET6 protocol family Jan 14 01:40:07.517997 kernel: Segment Routing with IPv6 Jan 14 01:40:07.518005 kernel: In-situ OAM (IOAM) with IPv6 Jan 14 01:40:07.518013 kernel: NET: Registered PF_PACKET protocol family Jan 14 01:40:07.518021 kernel: Key type dns_resolver registered Jan 14 01:40:07.518029 kernel: registered taskstats version 1 Jan 14 01:40:07.518039 kernel: Loading compiled-in X.509 certificates Jan 14 01:40:07.518047 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.65-flatcar: 61f104a5e4017e43c6bf0c9744e6a522053d7383' Jan 14 01:40:07.518055 kernel: Demotion targets for Node 0: null Jan 14 01:40:07.518063 kernel: Key type .fscrypt registered Jan 14 01:40:07.518071 kernel: Key type fscrypt-provisioning registered Jan 14 01:40:07.518078 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 14 01:40:07.518087 kernel: ima: Allocated hash algorithm: sha1 Jan 14 01:40:07.518095 kernel: ima: No architecture policies found Jan 14 01:40:07.518105 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jan 14 01:40:07.518113 kernel: clk: Disabling unused clocks Jan 14 01:40:07.518121 kernel: PM: genpd: Disabling unused power domains Jan 14 01:40:07.518129 kernel: Freeing unused kernel memory: 12480K Jan 14 01:40:07.518137 kernel: Run /init as init process Jan 14 01:40:07.518145 kernel: with arguments: Jan 14 01:40:07.518153 kernel: /init Jan 14 01:40:07.518162 kernel: with environment: Jan 14 01:40:07.518170 kernel: HOME=/ Jan 14 01:40:07.518178 kernel: TERM=linux Jan 14 01:40:07.518186 kernel: ACPI: bus type USB registered Jan 14 01:40:07.518195 kernel: usbcore: registered new interface driver usbfs Jan 14 01:40:07.518203 kernel: usbcore: registered new interface driver hub Jan 14 01:40:07.518211 kernel: usbcore: registered new device driver usb Jan 14 01:40:07.518335 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 14 01:40:07.518422 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 14 01:40:07.518505 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 14 01:40:07.518598 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 14 01:40:07.518684 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 14 01:40:07.518783 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 14 01:40:07.518906 kernel: hub 1-0:1.0: USB hub found Jan 14 01:40:07.519009 kernel: hub 1-0:1.0: 4 ports detected Jan 14 01:40:07.519123 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Jan 14 01:40:07.519238 kernel: hub 2-0:1.0: USB hub found Jan 14 01:40:07.519348 kernel: hub 2-0:1.0: 4 ports detected Jan 14 01:40:07.519446 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Jan 14 01:40:07.519531 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 14 01:40:07.519543 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 14 01:40:07.519552 kernel: GPT:25804799 != 104857599 Jan 14 01:40:07.519560 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 14 01:40:07.519569 kernel: GPT:25804799 != 104857599 Jan 14 01:40:07.519579 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 14 01:40:07.519587 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 14 01:40:07.519595 kernel: SCSI subsystem initialized Jan 14 01:40:07.519604 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 14 01:40:07.519612 kernel: device-mapper: uevent: version 1.0.3 Jan 14 01:40:07.519621 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 14 01:40:07.519630 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jan 14 01:40:07.519640 kernel: raid6: neonx8 gen() 15801 MB/s Jan 14 01:40:07.519648 kernel: raid6: neonx4 gen() 15540 MB/s Jan 14 01:40:07.519657 kernel: raid6: neonx2 gen() 12518 MB/s Jan 14 01:40:07.519665 kernel: raid6: neonx1 gen() 10488 MB/s Jan 14 01:40:07.519674 kernel: raid6: int64x8 gen() 6818 MB/s Jan 14 01:40:07.519682 kernel: raid6: int64x4 gen() 7297 MB/s Jan 14 01:40:07.519690 kernel: raid6: int64x2 gen() 6057 MB/s Jan 14 01:40:07.519699 kernel: raid6: int64x1 gen() 5033 MB/s Jan 14 01:40:07.519708 kernel: raid6: using algorithm neonx8 gen() 15801 MB/s Jan 14 01:40:07.519730 kernel: raid6: .... 
xor() 11991 MB/s, rmw enabled Jan 14 01:40:07.519740 kernel: raid6: using neon recovery algorithm Jan 14 01:40:07.519749 kernel: xor: measuring software checksum speed Jan 14 01:40:07.519759 kernel: 8regs : 21265 MB/sec Jan 14 01:40:07.519768 kernel: 32regs : 21676 MB/sec Jan 14 01:40:07.519783 kernel: arm64_neon : 26080 MB/sec Jan 14 01:40:07.519795 kernel: xor: using function: arm64_neon (26080 MB/sec) Jan 14 01:40:07.519922 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 14 01:40:07.519937 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 14 01:40:07.519946 kernel: BTRFS: device fsid 96ce121f-260d-446f-a0e2-a59fdf56d58c devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (275) Jan 14 01:40:07.519954 kernel: BTRFS info (device dm-0): first mount of filesystem 96ce121f-260d-446f-a0e2-a59fdf56d58c Jan 14 01:40:07.519965 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jan 14 01:40:07.519974 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 14 01:40:07.519983 kernel: BTRFS info (device dm-0): enabling free space tree Jan 14 01:40:07.519991 kernel: loop: module loaded Jan 14 01:40:07.519999 kernel: loop0: detected capacity change from 0 to 91840 Jan 14 01:40:07.520008 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 14 01:40:07.520117 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jan 14 01:40:07.520133 systemd[1]: Successfully made /usr/ read-only. Jan 14 01:40:07.520145 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 14 01:40:07.520154 systemd[1]: Detected virtualization kvm. 
Jan 14 01:40:07.520163 systemd[1]: Detected architecture arm64. Jan 14 01:40:07.520172 systemd[1]: Running in initrd. Jan 14 01:40:07.520180 systemd[1]: No hostname configured, using default hostname. Jan 14 01:40:07.520191 systemd[1]: Hostname set to . Jan 14 01:40:07.520200 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 14 01:40:07.520208 systemd[1]: Queued start job for default target initrd.target. Jan 14 01:40:07.520217 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 14 01:40:07.520226 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 14 01:40:07.520235 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 01:40:07.520246 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 14 01:40:07.520254 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 14 01:40:07.520264 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 14 01:40:07.520273 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 14 01:40:07.520282 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 14 01:40:07.520291 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 14 01:40:07.520301 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 14 01:40:07.520310 systemd[1]: Reached target paths.target - Path Units. Jan 14 01:40:07.520319 systemd[1]: Reached target slices.target - Slice Units. Jan 14 01:40:07.520327 systemd[1]: Reached target swap.target - Swaps. Jan 14 01:40:07.520336 systemd[1]: Reached target timers.target - Timer Units. Jan 14 01:40:07.520345 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. 
Jan 14 01:40:07.520355 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 14 01:40:07.520364 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 14 01:40:07.520373 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 14 01:40:07.520382 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 14 01:40:07.520391 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 14 01:40:07.520400 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 14 01:40:07.520409 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 14 01:40:07.520419 systemd[1]: Reached target sockets.target - Socket Units. Jan 14 01:40:07.520429 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 14 01:40:07.520438 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 14 01:40:07.520447 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 14 01:40:07.520456 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 14 01:40:07.520470 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 14 01:40:07.520479 systemd[1]: Starting systemd-fsck-usr.service... Jan 14 01:40:07.520491 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 14 01:40:07.520499 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 14 01:40:07.520509 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 01:40:07.520519 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. 
Jan 14 01:40:07.520528 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 01:40:07.520537 systemd[1]: Finished systemd-fsck-usr.service. Jan 14 01:40:07.520546 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 14 01:40:07.520578 systemd-journald[418]: Collecting audit messages is enabled. Jan 14 01:40:07.520600 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 14 01:40:07.520609 kernel: Bridge firewalling registered Jan 14 01:40:07.520618 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 14 01:40:07.520629 kernel: audit: type=1130 audit(1768354807.459:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.520638 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:40:07.520648 kernel: audit: type=1130 audit(1768354807.464:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.520657 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 14 01:40:07.520666 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 14 01:40:07.520675 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 14 01:40:07.520684 kernel: audit: type=1130 audit(1768354807.482:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:07.520693 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 14 01:40:07.520703 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 14 01:40:07.520713 kernel: audit: type=1130 audit(1768354807.493:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.520736 kernel: audit: type=1334 audit(1768354807.496:6): prog-id=6 op=LOAD Jan 14 01:40:07.520745 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 14 01:40:07.520754 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 14 01:40:07.520763 kernel: audit: type=1130 audit(1768354807.504:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.520772 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 01:40:07.520783 kernel: audit: type=1130 audit(1768354807.517:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.520794 systemd-journald[418]: Journal started Jan 14 01:40:07.520813 systemd-journald[418]: Runtime Journal (/run/log/journal/fbe354619c494ef3963c0cd3c0fa2216) is 8M, max 319.5M, 311.5M free. Jan 14 01:40:07.459000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:07.464000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.482000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.493000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.496000 audit: BPF prog-id=6 op=LOAD Jan 14 01:40:07.504000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.517000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.455859 systemd-modules-load[420]: Inserted module 'br_netfilter' Jan 14 01:40:07.523373 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 14 01:40:07.526331 systemd[1]: Started systemd-journald.service - Journal Service. Jan 14 01:40:07.525000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.529738 kernel: audit: type=1130 audit(1768354807.525:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:07.533857 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 14 01:40:07.552353 dracut-cmdline[447]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=a2e92265a189403c21ae2a2ae9e6d4fed0782e0e430fbcb369a7bb0db156274f Jan 14 01:40:07.553139 systemd-tmpfiles[456]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 14 01:40:07.565202 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 14 01:40:07.566002 systemd-resolved[443]: Positive Trust Anchors: Jan 14 01:40:07.566011 systemd-resolved[443]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 14 01:40:07.569000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.566014 systemd-resolved[443]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 14 01:40:07.574363 kernel: audit: type=1130 audit(1768354807.569:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:07.566045 systemd-resolved[443]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 14 01:40:07.593313 systemd-resolved[443]: Defaulting to hostname 'linux'. Jan 14 01:40:07.594371 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 14 01:40:07.594000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.595589 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 14 01:40:07.641760 kernel: Loading iSCSI transport class v2.0-870. Jan 14 01:40:07.652741 kernel: iscsi: registered transport (tcp) Jan 14 01:40:07.667858 kernel: iscsi: registered transport (qla4xxx) Jan 14 01:40:07.667919 kernel: QLogic iSCSI HBA Driver Jan 14 01:40:07.691364 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 14 01:40:07.711924 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 14 01:40:07.712000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.713437 systemd[1]: Reached target network-pre.target - Preparation for Network. 
Jan 14 01:40:07.761492 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 14 01:40:07.761000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.763985 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 14 01:40:07.765519 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 14 01:40:07.801453 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 14 01:40:07.801000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.802000 audit: BPF prog-id=7 op=LOAD Jan 14 01:40:07.802000 audit: BPF prog-id=8 op=LOAD Jan 14 01:40:07.804138 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 14 01:40:07.839410 systemd-udevd[694]: Using default interface naming scheme 'v257'. Jan 14 01:40:07.847246 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 14 01:40:07.848000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.850986 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 14 01:40:07.873070 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 14 01:40:07.874000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:07.875000 audit: BPF prog-id=9 op=LOAD Jan 14 01:40:07.876100 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 14 01:40:07.882474 dracut-pre-trigger[765]: rd.md=0: removing MD RAID activation Jan 14 01:40:07.907807 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 14 01:40:07.908000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.910167 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 14 01:40:07.918378 systemd-networkd[802]: lo: Link UP Jan 14 01:40:07.918386 systemd-networkd[802]: lo: Gained carrier Jan 14 01:40:07.919000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:07.919012 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 14 01:40:07.920522 systemd[1]: Reached target network.target - Network. Jan 14 01:40:07.998331 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 01:40:07.998000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:08.002911 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 14 01:40:08.085652 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 14 01:40:08.097690 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. 
Jan 14 01:40:08.104124 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jan 14 01:40:08.104157 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 14 01:40:08.104491 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jan 14 01:40:08.112238 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 14 01:40:08.123840 systemd-networkd[802]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:40:08.123850 systemd-networkd[802]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 14 01:40:08.124564 systemd-networkd[802]: eth0: Link UP Jan 14 01:40:08.124785 systemd-networkd[802]: eth0: Gained carrier Jan 14 01:40:08.124796 systemd-networkd[802]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:40:08.125731 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 14 01:40:08.132310 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 14 01:40:08.134134 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 01:40:08.134251 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:40:08.135000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:08.136258 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
Jan 14 01:40:08.159753 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jan 14 01:40:08.159963 kernel: usbcore: registered new interface driver usbhid Jan 14 01:40:08.159976 kernel: usbhid: USB HID core driver Jan 14 01:40:08.160243 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 01:40:08.173751 disk-uuid[878]: Primary Header is updated. Jan 14 01:40:08.173751 disk-uuid[878]: Secondary Entries is updated. Jan 14 01:40:08.173751 disk-uuid[878]: Secondary Header is updated. Jan 14 01:40:08.187158 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 14 01:40:08.187000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:08.190006 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 14 01:40:08.193656 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 01:40:08.195146 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 14 01:40:08.199134 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 14 01:40:08.203150 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:40:08.203000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:08.225771 systemd-networkd[802]: eth0: DHCPv4 address 10.0.35.206/25, gateway 10.0.35.129 acquired from 10.0.35.129 Jan 14 01:40:08.226418 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
Jan 14 01:40:08.228000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:09.218618 disk-uuid[882]: Warning: The kernel is still using the old partition table. Jan 14 01:40:09.218618 disk-uuid[882]: The new table will be used at the next reboot or after you Jan 14 01:40:09.218618 disk-uuid[882]: run partprobe(8) or kpartx(8) Jan 14 01:40:09.218618 disk-uuid[882]: The operation has completed successfully. Jan 14 01:40:09.228926 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 14 01:40:09.229045 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 14 01:40:09.230000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:09.230000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:09.231818 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 14 01:40:09.267774 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (910) Jan 14 01:40:09.270539 kernel: BTRFS info (device vda6): first mount of filesystem 43f26778-0bac-4551-a250-d0042cfe708e Jan 14 01:40:09.270596 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 14 01:40:09.275347 kernel: BTRFS info (device vda6): turning on async discard Jan 14 01:40:09.275420 kernel: BTRFS info (device vda6): enabling free space tree Jan 14 01:40:09.281756 kernel: BTRFS info (device vda6): last unmount of filesystem 43f26778-0bac-4551-a250-d0042cfe708e Jan 14 01:40:09.282006 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Jan 14 01:40:09.282000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:09.283884 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 14 01:40:09.442848 ignition[929]: Ignition 2.24.0 Jan 14 01:40:09.442864 ignition[929]: Stage: fetch-offline Jan 14 01:40:09.442900 ignition[929]: no configs at "/usr/lib/ignition/base.d" Jan 14 01:40:09.442910 ignition[929]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 01:40:09.445343 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 14 01:40:09.451213 kernel: kauditd_printk_skb: 19 callbacks suppressed Jan 14 01:40:09.451237 kernel: audit: type=1130 audit(1768354809.446:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:09.446000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:09.443074 ignition[929]: parsed url from cmdline: "" Jan 14 01:40:09.450055 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Jan 14 01:40:09.443078 ignition[929]: no config URL provided Jan 14 01:40:09.443083 ignition[929]: reading system config file "/usr/lib/ignition/user.ign" Jan 14 01:40:09.443091 ignition[929]: no config at "/usr/lib/ignition/user.ign" Jan 14 01:40:09.443096 ignition[929]: failed to fetch config: resource requires networking Jan 14 01:40:09.443241 ignition[929]: Ignition finished successfully Jan 14 01:40:09.485326 ignition[943]: Ignition 2.24.0 Jan 14 01:40:09.485346 ignition[943]: Stage: fetch Jan 14 01:40:09.485488 ignition[943]: no configs at "/usr/lib/ignition/base.d" Jan 14 01:40:09.485497 ignition[943]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 01:40:09.485574 ignition[943]: parsed url from cmdline: "" Jan 14 01:40:09.485577 ignition[943]: no config URL provided Jan 14 01:40:09.485581 ignition[943]: reading system config file "/usr/lib/ignition/user.ign" Jan 14 01:40:09.485587 ignition[943]: no config at "/usr/lib/ignition/user.ign" Jan 14 01:40:09.486150 ignition[943]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 14 01:40:09.486266 ignition[943]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 14 01:40:09.486283 ignition[943]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... 
Jan 14 01:40:09.855423 ignition[943]: GET result: OK Jan 14 01:40:09.855715 ignition[943]: parsing config with SHA512: 757886e54fd9a42b0168ead57f1846bc2ad5fb7eb0e76fcc0b7f176e4863ccff823272b3d1f2eb7b84b623ad17d52d2ecfe87be79ef01065cb3cdbfcb2508c64 Jan 14 01:40:09.860650 unknown[943]: fetched base config from "system" Jan 14 01:40:09.860660 unknown[943]: fetched base config from "system" Jan 14 01:40:09.861057 ignition[943]: fetch: fetch complete Jan 14 01:40:09.860666 unknown[943]: fetched user config from "openstack" Jan 14 01:40:09.864000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:09.861062 ignition[943]: fetch: fetch passed Jan 14 01:40:09.869466 kernel: audit: type=1130 audit(1768354809.864:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:09.863615 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 14 01:40:09.861114 ignition[943]: Ignition finished successfully Jan 14 01:40:09.865910 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 14 01:40:09.897145 ignition[951]: Ignition 2.24.0 Jan 14 01:40:09.897164 ignition[951]: Stage: kargs Jan 14 01:40:09.897312 ignition[951]: no configs at "/usr/lib/ignition/base.d" Jan 14 01:40:09.897321 ignition[951]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 01:40:09.898092 ignition[951]: kargs: kargs passed Jan 14 01:40:09.898147 ignition[951]: Ignition finished successfully Jan 14 01:40:09.902809 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 14 01:40:09.903000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:09.904699 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 14 01:40:09.908518 kernel: audit: type=1130 audit(1768354809.903:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:09.935004 ignition[958]: Ignition 2.24.0 Jan 14 01:40:09.935022 ignition[958]: Stage: disks Jan 14 01:40:09.935179 ignition[958]: no configs at "/usr/lib/ignition/base.d" Jan 14 01:40:09.938071 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 14 01:40:09.939000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:09.935187 ignition[958]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 01:40:09.944412 kernel: audit: type=1130 audit(1768354809.939:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:09.939372 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 14 01:40:09.935961 ignition[958]: disks: disks passed Jan 14 01:40:09.943475 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 14 01:40:09.936009 ignition[958]: Ignition finished successfully Jan 14 01:40:09.945508 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 14 01:40:09.947398 systemd[1]: Reached target sysinit.target - System Initialization. Jan 14 01:40:09.948757 systemd[1]: Reached target basic.target - Basic System. Jan 14 01:40:09.951475 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Jan 14 01:40:10.005182 systemd-fsck[967]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 14 01:40:10.008955 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 14 01:40:10.010000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:10.011827 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 14 01:40:10.015929 kernel: audit: type=1130 audit(1768354810.010:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:10.050953 systemd-networkd[802]: eth0: Gained IPv6LL Jan 14 01:40:10.112781 kernel: EXT4-fs (vda9): mounted filesystem b1eb7e1a-01a1-41b0-9b3c-5a37b4853d4d r/w with ordered data mode. Quota mode: none. Jan 14 01:40:10.113387 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 14 01:40:10.114667 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 14 01:40:10.118652 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 14 01:40:10.120525 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 14 01:40:10.121823 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 14 01:40:10.122458 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 14 01:40:10.125831 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 14 01:40:10.125866 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 14 01:40:10.134711 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
Jan 14 01:40:10.137056 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 14 01:40:10.141742 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (975) Jan 14 01:40:10.146043 kernel: BTRFS info (device vda6): first mount of filesystem 43f26778-0bac-4551-a250-d0042cfe708e Jan 14 01:40:10.146136 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 14 01:40:10.150192 kernel: BTRFS info (device vda6): turning on async discard Jan 14 01:40:10.150297 kernel: BTRFS info (device vda6): enabling free space tree Jan 14 01:40:10.151522 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 14 01:40:10.189746 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:40:10.317829 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 14 01:40:10.318000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:10.320183 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 14 01:40:10.323876 kernel: audit: type=1130 audit(1768354810.318:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:10.323780 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 14 01:40:10.339769 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 14 01:40:10.341486 kernel: BTRFS info (device vda6): last unmount of filesystem 43f26778-0bac-4551-a250-d0042cfe708e Jan 14 01:40:10.359972 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 14 01:40:10.360000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:10.364741 kernel: audit: type=1130 audit(1768354810.360:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:10.371273 ignition[1077]: INFO : Ignition 2.24.0 Jan 14 01:40:10.371273 ignition[1077]: INFO : Stage: mount Jan 14 01:40:10.372874 ignition[1077]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 01:40:10.372874 ignition[1077]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 01:40:10.372874 ignition[1077]: INFO : mount: mount passed Jan 14 01:40:10.372874 ignition[1077]: INFO : Ignition finished successfully Jan 14 01:40:10.375000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:10.375192 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 14 01:40:10.380589 kernel: audit: type=1130 audit(1768354810.375:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:11.224754 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:40:13.235746 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:40:17.245756 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:40:17.250505 coreos-metadata[977]: Jan 14 01:40:17.249 WARN failed to locate config-drive, using the metadata service API instead Jan 14 01:40:17.268975 coreos-metadata[977]: Jan 14 01:40:17.268 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 14 01:40:17.413529 coreos-metadata[977]: Jan 14 01:40:17.413 INFO Fetch successful Jan 14 01:40:17.413529 coreos-metadata[977]: Jan 14 01:40:17.413 INFO wrote hostname ci-4578-0-0-p-96753e66ce to /sysroot/etc/hostname Jan 14 01:40:17.417787 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 14 01:40:17.419160 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 14 01:40:17.428144 kernel: audit: type=1130 audit(1768354817.421:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:17.428172 kernel: audit: type=1131 audit(1768354817.421:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:17.421000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:17.421000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:17.423217 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 14 01:40:17.443128 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 14 01:40:17.476777 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1093) Jan 14 01:40:17.480810 kernel: BTRFS info (device vda6): first mount of filesystem 43f26778-0bac-4551-a250-d0042cfe708e Jan 14 01:40:17.480835 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 14 01:40:17.485477 kernel: BTRFS info (device vda6): turning on async discard Jan 14 01:40:17.485566 kernel: BTRFS info (device vda6): enabling free space tree Jan 14 01:40:17.487056 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 14 01:40:17.517514 ignition[1111]: INFO : Ignition 2.24.0 Jan 14 01:40:17.517514 ignition[1111]: INFO : Stage: files Jan 14 01:40:17.519591 ignition[1111]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 01:40:17.519591 ignition[1111]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 01:40:17.519591 ignition[1111]: DEBUG : files: compiled without relabeling support, skipping Jan 14 01:40:17.519591 ignition[1111]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 14 01:40:17.519591 ignition[1111]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 14 01:40:17.527261 ignition[1111]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 14 01:40:17.527261 ignition[1111]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 14 01:40:17.527261 ignition[1111]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 14 01:40:17.525174 unknown[1111]: wrote ssh authorized keys file for user: core Jan 14 01:40:17.532043 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file 
"/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Jan 14 01:40:17.532043 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Jan 14 01:40:17.585989 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 14 01:40:17.690832 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Jan 14 01:40:17.692786 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 14 01:40:17.692786 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 14 01:40:17.692786 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 14 01:40:17.692786 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 14 01:40:17.692786 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 14 01:40:17.692786 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 14 01:40:17.692786 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 14 01:40:17.692786 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 14 01:40:17.706156 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 14 01:40:17.706156 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file 
"/sysroot/etc/flatcar/update.conf" Jan 14 01:40:17.706156 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 14 01:40:17.706156 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 14 01:40:17.706156 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 14 01:40:17.706156 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1 Jan 14 01:40:17.974755 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 14 01:40:18.538628 ignition[1111]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 14 01:40:18.538628 ignition[1111]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 14 01:40:18.543387 ignition[1111]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 14 01:40:18.547171 ignition[1111]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 14 01:40:18.547171 ignition[1111]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 14 01:40:18.550076 ignition[1111]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 14 01:40:18.550076 ignition[1111]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 14 01:40:18.550076 ignition[1111]: INFO : 
files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 14 01:40:18.550076 ignition[1111]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 14 01:40:18.550076 ignition[1111]: INFO : files: files passed Jan 14 01:40:18.550076 ignition[1111]: INFO : Ignition finished successfully Jan 14 01:40:18.562392 kernel: audit: type=1130 audit(1768354818.552:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.552000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.551308 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 14 01:40:18.553329 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 14 01:40:18.557501 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 14 01:40:18.574236 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 14 01:40:18.574347 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 14 01:40:18.576000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.576000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.582208 kernel: audit: type=1130 audit(1768354818.576:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 14 01:40:18.582247 kernel: audit: type=1131 audit(1768354818.576:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.582266 initrd-setup-root-after-ignition[1144]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 14 01:40:18.582266 initrd-setup-root-after-ignition[1144]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 14 01:40:18.585183 initrd-setup-root-after-ignition[1148]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 14 01:40:18.590316 kernel: audit: type=1130 audit(1768354818.586:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.586000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.584861 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 14 01:40:18.586536 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 14 01:40:18.592261 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 14 01:40:18.628292 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 14 01:40:18.628417 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 14 01:40:18.636596 kernel: audit: type=1130 audit(1768354818.630:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:18.636621 kernel: audit: type=1131 audit(1768354818.630:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.630000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.630000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.630617 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 14 01:40:18.637540 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 14 01:40:18.639530 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 14 01:40:18.640440 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 14 01:40:18.674344 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 14 01:40:18.675000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.676801 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 14 01:40:18.681069 kernel: audit: type=1130 audit(1768354818.675:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.698260 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 14 01:40:18.698461 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. 
Jan 14 01:40:18.700694 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 01:40:18.702767 systemd[1]: Stopped target timers.target - Timer Units. Jan 14 01:40:18.704550 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 14 01:40:18.705000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.704681 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 14 01:40:18.710438 kernel: audit: type=1131 audit(1768354818.705:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.709449 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 14 01:40:18.711434 systemd[1]: Stopped target basic.target - Basic System. Jan 14 01:40:18.712995 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 14 01:40:18.714654 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 14 01:40:18.716571 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 14 01:40:18.718505 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 14 01:40:18.720360 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 14 01:40:18.722095 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 14 01:40:18.723882 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 14 01:40:18.725780 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 14 01:40:18.727510 systemd[1]: Stopped target swap.target - Swaps. Jan 14 01:40:18.728953 systemd[1]: dracut-pre-mount.service: Deactivated successfully. 
Jan 14 01:40:18.729000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.729091 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 14 01:40:18.731325 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 14 01:40:18.733217 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 14 01:40:18.735028 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 14 01:40:18.735888 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 14 01:40:18.738000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.737053 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 14 01:40:18.737169 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 14 01:40:18.741000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.739901 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 14 01:40:18.742000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.740013 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 14 01:40:18.741895 systemd[1]: ignition-files.service: Deactivated successfully. Jan 14 01:40:18.741997 systemd[1]: Stopped ignition-files.service - Ignition (files). 
Jan 14 01:40:18.748000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.744614 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 14 01:40:18.746526 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 14 01:40:18.746662 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 01:40:18.769396 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 14 01:40:18.770289 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 14 01:40:18.771000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.770421 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 14 01:40:18.773000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.772316 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 14 01:40:18.775000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.772427 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 01:40:18.774181 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 14 01:40:18.774301 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 14 01:40:18.780211 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Jan 14 01:40:18.780313 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 14 01:40:18.781000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.781000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.784403 ignition[1168]: INFO : Ignition 2.24.0 Jan 14 01:40:18.784403 ignition[1168]: INFO : Stage: umount Jan 14 01:40:18.784403 ignition[1168]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 01:40:18.784403 ignition[1168]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 14 01:40:18.784403 ignition[1168]: INFO : umount: umount passed Jan 14 01:40:18.784403 ignition[1168]: INFO : Ignition finished successfully Jan 14 01:40:18.789000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.784859 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 14 01:40:18.795000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.796000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.784960 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 14 01:40:18.798000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 14 01:40:18.790984 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 14 01:40:18.791397 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 14 01:40:18.802000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.791435 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 14 01:40:18.796248 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 14 01:40:18.796311 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 14 01:40:18.797397 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 14 01:40:18.797447 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 14 01:40:18.799537 systemd[1]: Stopped target network.target - Network. Jan 14 01:40:18.801209 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 14 01:40:18.801268 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 14 01:40:18.802998 systemd[1]: Stopped target paths.target - Path Units. Jan 14 01:40:18.805686 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 14 01:40:18.809783 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 01:40:18.824000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.810983 systemd[1]: Stopped target slices.target - Slice Units. Jan 14 01:40:18.827000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:18.813899 systemd[1]: Stopped target sockets.target - Socket Units. Jan 14 01:40:18.815582 systemd[1]: iscsid.socket: Deactivated successfully. Jan 14 01:40:18.815628 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 14 01:40:18.817497 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 14 01:40:18.817530 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 14 01:40:18.819429 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 14 01:40:18.819451 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 14 01:40:18.822782 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 14 01:40:18.822842 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 14 01:40:18.824670 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 14 01:40:18.824715 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 14 01:40:18.842000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.827244 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 14 01:40:18.828782 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 14 01:40:18.841053 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 14 01:40:18.841162 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 14 01:40:18.849000 audit: BPF prog-id=6 op=UNLOAD Jan 14 01:40:18.848396 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 14 01:40:18.849000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:18.848516 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 14 01:40:18.852752 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 14 01:40:18.853817 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 14 01:40:18.853863 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 14 01:40:18.856601 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 14 01:40:18.859000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.857491 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 14 01:40:18.861000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.857554 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 14 01:40:18.863000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.859681 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 14 01:40:18.864000 audit: BPF prog-id=9 op=UNLOAD Jan 14 01:40:18.859741 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 14 01:40:18.867000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.861881 systemd[1]: systemd-modules-load.service: Deactivated successfully. 
Jan 14 01:40:18.869000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.861928 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 14 01:40:18.863904 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 14 01:40:18.866239 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 14 01:40:18.866339 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 14 01:40:18.868030 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 14 01:40:18.868120 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 14 01:40:18.878482 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 14 01:40:18.881927 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 14 01:40:18.883000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.883565 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 14 01:40:18.883605 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 14 01:40:18.885308 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 14 01:40:18.888000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.885343 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 14 01:40:18.887066 systemd[1]: dracut-pre-udev.service: Deactivated successfully. 
Jan 14 01:40:18.891000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.887126 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 14 01:40:18.889622 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 14 01:40:18.894000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.889670 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 14 01:40:18.892433 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 14 01:40:18.892486 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 14 01:40:18.899000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.896163 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 14 01:40:18.900000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.897185 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 14 01:40:18.903000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.897250 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. 
Jan 14 01:40:18.899154 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 14 01:40:18.899202 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 01:40:18.901305 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 01:40:18.901354 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:40:18.903805 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 14 01:40:18.922023 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 14 01:40:18.922000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.928434 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 14 01:40:18.928543 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 14 01:40:18.930000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.930000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:18.931079 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 14 01:40:18.932788 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 14 01:40:18.967588 systemd[1]: Switching root. Jan 14 01:40:18.997034 systemd-journald[418]: Journal stopped Jan 14 01:40:19.976137 systemd-journald[418]: Received SIGTERM from PID 1 (systemd). 
Jan 14 01:40:19.976218 kernel: SELinux: policy capability network_peer_controls=1 Jan 14 01:40:19.976237 kernel: SELinux: policy capability open_perms=1 Jan 14 01:40:19.976250 kernel: SELinux: policy capability extended_socket_class=1 Jan 14 01:40:19.976263 kernel: SELinux: policy capability always_check_network=0 Jan 14 01:40:19.976276 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 14 01:40:19.976289 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 14 01:40:19.976300 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 14 01:40:19.976312 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 14 01:40:19.976323 kernel: SELinux: policy capability userspace_initial_context=0 Jan 14 01:40:19.976334 systemd[1]: Successfully loaded SELinux policy in 67.002ms. Jan 14 01:40:19.976356 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.883ms. Jan 14 01:40:19.976371 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 14 01:40:19.976386 systemd[1]: Detected virtualization kvm. Jan 14 01:40:19.976397 systemd[1]: Detected architecture arm64. Jan 14 01:40:19.976410 systemd[1]: Detected first boot. Jan 14 01:40:19.976443 systemd[1]: Hostname set to . Jan 14 01:40:19.976455 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 14 01:40:19.976467 zram_generator::config[1216]: No configuration found. Jan 14 01:40:19.976482 kernel: NET: Registered PF_VSOCK protocol family Jan 14 01:40:19.976492 systemd[1]: Populated /etc with preset unit settings. Jan 14 01:40:19.976503 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 14 01:40:19.976514 systemd[1]: Stopped initrd-switch-root.service - Switch Root. 
Jan 14 01:40:19.976525 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 14 01:40:19.976539 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 14 01:40:19.976550 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 14 01:40:19.976561 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 14 01:40:19.976573 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 14 01:40:19.976584 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 14 01:40:19.976595 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 14 01:40:19.976608 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 14 01:40:19.976620 systemd[1]: Created slice user.slice - User and Session Slice. Jan 14 01:40:19.976631 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 14 01:40:19.976642 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 01:40:19.976653 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 14 01:40:19.976663 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 14 01:40:19.976675 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 14 01:40:19.976686 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 14 01:40:19.976699 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Jan 14 01:40:19.976709 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 14 01:40:19.976731 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. 
Jan 14 01:40:19.976743 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 14 01:40:19.976754 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 14 01:40:19.976766 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 14 01:40:19.976777 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 14 01:40:19.976788 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 01:40:19.976799 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 14 01:40:19.976810 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 14 01:40:19.976821 systemd[1]: Reached target slices.target - Slice Units. Jan 14 01:40:19.976832 systemd[1]: Reached target swap.target - Swaps. Jan 14 01:40:19.976844 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 14 01:40:19.976855 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 14 01:40:19.976865 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 14 01:40:19.976876 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 14 01:40:19.976887 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 14 01:40:19.976898 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 14 01:40:19.976909 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 14 01:40:19.976921 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 14 01:40:19.976932 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 14 01:40:19.976942 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 14 01:40:19.976953 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. 
Jan 14 01:40:19.976964 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 14 01:40:19.976974 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 14 01:40:19.976985 systemd[1]: Mounting media.mount - External Media Directory... Jan 14 01:40:19.976997 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 14 01:40:19.977008 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 14 01:40:19.977018 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 14 01:40:19.977030 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 14 01:40:19.977041 systemd[1]: Reached target machines.target - Containers. Jan 14 01:40:19.977051 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 14 01:40:19.977062 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 01:40:19.977074 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 14 01:40:19.977085 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 14 01:40:19.977096 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 14 01:40:19.977107 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 14 01:40:19.977119 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 14 01:40:19.977130 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 14 01:40:19.977141 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 14 01:40:19.977153 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). 
Jan 14 01:40:19.977164 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 14 01:40:19.977175 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 14 01:40:19.977187 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 14 01:40:19.977205 systemd[1]: Stopped systemd-fsck-usr.service. Jan 14 01:40:19.977220 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 01:40:19.977232 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 14 01:40:19.977245 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 14 01:40:19.977256 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 14 01:40:19.977268 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 14 01:40:19.977280 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 14 01:40:19.977290 kernel: fuse: init (API version 7.41) Jan 14 01:40:19.977301 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 14 01:40:19.977311 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 14 01:40:19.977322 kernel: ACPI: bus type drm_connector registered Jan 14 01:40:19.977332 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 14 01:40:19.977343 systemd[1]: Mounted media.mount - External Media Directory. Jan 14 01:40:19.977373 systemd-journald[1291]: Collecting audit messages is enabled. Jan 14 01:40:19.977408 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
Jan 14 01:40:19.977421 systemd-journald[1291]: Journal started Jan 14 01:40:19.977444 systemd-journald[1291]: Runtime Journal (/run/log/journal/fbe354619c494ef3963c0cd3c0fa2216) is 8M, max 319.5M, 311.5M free. Jan 14 01:40:19.817000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 14 01:40:19.925000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.927000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.930000 audit: BPF prog-id=14 op=UNLOAD Jan 14 01:40:19.930000 audit: BPF prog-id=13 op=UNLOAD Jan 14 01:40:19.933000 audit: BPF prog-id=15 op=LOAD Jan 14 01:40:19.933000 audit: BPF prog-id=16 op=LOAD Jan 14 01:40:19.933000 audit: BPF prog-id=17 op=LOAD Jan 14 01:40:19.973000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 14 01:40:19.973000 audit[1291]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=6 a1=fffff0596860 a2=4000 a3=0 items=0 ppid=1 pid=1291 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:19.973000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 14 01:40:19.721513 systemd[1]: Queued start job for default target multi-user.target. Jan 14 01:40:19.742230 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. 
Jan 14 01:40:19.742704 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 14 01:40:19.980739 systemd[1]: Started systemd-journald.service - Journal Service. Jan 14 01:40:19.981000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.982973 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 14 01:40:19.984191 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 14 01:40:19.985431 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 01:40:19.987544 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 14 01:40:19.986000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.988000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.989011 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 14 01:40:19.989178 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 14 01:40:19.989000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.989000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:19.990700 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 14 01:40:19.990897 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 14 01:40:19.991000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.991000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.992288 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 14 01:40:19.992447 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 14 01:40:19.994000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.994000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.995106 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 14 01:40:19.995279 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 14 01:40:19.995000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.995000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:19.996829 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 14 01:40:19.996987 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 14 01:40:19.997000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.997000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.998374 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 14 01:40:19.998536 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 14 01:40:19.998000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.998000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:19.999987 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 14 01:40:20.000000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.001550 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 14 01:40:20.002000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 14 01:40:20.003824 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 14 01:40:20.004000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.005572 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 14 01:40:20.006000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.017471 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 14 01:40:20.019846 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 14 01:40:20.022128 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 14 01:40:20.024102 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 14 01:40:20.025191 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 14 01:40:20.025220 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 14 01:40:20.027030 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 14 01:40:20.028385 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 01:40:20.028492 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 14 01:40:20.032891 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... 
Jan 14 01:40:20.034844 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 14 01:40:20.035974 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 14 01:40:20.040889 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 14 01:40:20.042050 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 14 01:40:20.043960 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 14 01:40:20.047254 systemd-journald[1291]: Time spent on flushing to /var/log/journal/fbe354619c494ef3963c0cd3c0fa2216 is 31.549ms for 1810 entries. Jan 14 01:40:20.047254 systemd-journald[1291]: System Journal (/var/log/journal/fbe354619c494ef3963c0cd3c0fa2216) is 8M, max 588.1M, 580.1M free. Jan 14 01:40:20.094431 systemd-journald[1291]: Received client request to flush runtime journal. Jan 14 01:40:20.094495 kernel: loop1: detected capacity change from 0 to 100192 Jan 14 01:40:20.057000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.067000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.077000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.050869 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... 
Jan 14 01:40:20.054125 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 14 01:40:20.056821 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 01:40:20.059134 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 14 01:40:20.060598 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 14 01:40:20.066897 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 14 01:40:20.069406 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 14 01:40:20.072471 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 14 01:40:20.075881 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 14 01:40:20.096926 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 14 01:40:20.098000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.105897 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 14 01:40:20.106000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.115506 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 14 01:40:20.116000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:20.117000 audit: BPF prog-id=18 op=LOAD Jan 14 01:40:20.117000 audit: BPF prog-id=19 op=LOAD Jan 14 01:40:20.117000 audit: BPF prog-id=20 op=LOAD Jan 14 01:40:20.118611 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 14 01:40:20.119000 audit: BPF prog-id=21 op=LOAD Jan 14 01:40:20.121309 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 14 01:40:20.125881 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 14 01:40:20.127000 audit: BPF prog-id=22 op=LOAD Jan 14 01:40:20.127000 audit: BPF prog-id=23 op=LOAD Jan 14 01:40:20.127000 audit: BPF prog-id=24 op=LOAD Jan 14 01:40:20.129033 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 14 01:40:20.130000 audit: BPF prog-id=25 op=LOAD Jan 14 01:40:20.130000 audit: BPF prog-id=26 op=LOAD Jan 14 01:40:20.130000 audit: BPF prog-id=27 op=LOAD Jan 14 01:40:20.131479 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 14 01:40:20.139741 kernel: loop2: detected capacity change from 0 to 1648 Jan 14 01:40:20.159185 systemd-tmpfiles[1356]: ACLs are not supported, ignoring. Jan 14 01:40:20.159207 systemd-tmpfiles[1356]: ACLs are not supported, ignoring. Jan 14 01:40:20.164847 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 01:40:20.165964 systemd-nsresourced[1357]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 14 01:40:20.165000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.167361 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. 
Jan 14 01:40:20.168000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.169231 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 14 01:40:20.171764 kernel: loop3: detected capacity change from 0 to 45344 Jan 14 01:40:20.171000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.212697 systemd-oomd[1354]: No swap; memory pressure usage will be degraded Jan 14 01:40:20.214289 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 14 01:40:20.215000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.215748 kernel: loop4: detected capacity change from 0 to 211168 Jan 14 01:40:20.224542 systemd-resolved[1355]: Positive Trust Anchors: Jan 14 01:40:20.224564 systemd-resolved[1355]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 14 01:40:20.224568 systemd-resolved[1355]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 14 01:40:20.224599 systemd-resolved[1355]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 14 01:40:20.235526 systemd-resolved[1355]: Using system hostname 'ci-4578-0-0-p-96753e66ce'. Jan 14 01:40:20.236858 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 14 01:40:20.237000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.238172 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 14 01:40:20.262776 kernel: loop5: detected capacity change from 0 to 100192 Jan 14 01:40:20.273741 kernel: loop6: detected capacity change from 0 to 1648 Jan 14 01:40:20.278745 kernel: loop7: detected capacity change from 0 to 45344 Jan 14 01:40:20.290742 kernel: loop1: detected capacity change from 0 to 211168 Jan 14 01:40:20.304355 (sd-merge)[1380]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. Jan 14 01:40:20.307457 (sd-merge)[1380]: Merged extensions into '/usr'. Jan 14 01:40:20.311320 systemd[1]: Reload requested from client PID 1336 ('systemd-sysext') (unit systemd-sysext.service)... Jan 14 01:40:20.311337 systemd[1]: Reloading... Jan 14 01:40:20.354205 zram_generator::config[1410]: No configuration found. 
Jan 14 01:40:20.507686 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 14 01:40:20.507865 systemd[1]: Reloading finished in 196 ms. Jan 14 01:40:20.539196 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 14 01:40:20.540000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.540759 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 14 01:40:20.541000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.561456 systemd[1]: Starting ensure-sysext.service... Jan 14 01:40:20.563361 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 14 01:40:20.564000 audit: BPF prog-id=8 op=UNLOAD Jan 14 01:40:20.564000 audit: BPF prog-id=7 op=UNLOAD Jan 14 01:40:20.564000 audit: BPF prog-id=28 op=LOAD Jan 14 01:40:20.564000 audit: BPF prog-id=29 op=LOAD Jan 14 01:40:20.565870 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 14 01:40:20.567000 audit: BPF prog-id=30 op=LOAD Jan 14 01:40:20.567000 audit: BPF prog-id=25 op=UNLOAD Jan 14 01:40:20.567000 audit: BPF prog-id=31 op=LOAD Jan 14 01:40:20.567000 audit: BPF prog-id=32 op=LOAD Jan 14 01:40:20.567000 audit: BPF prog-id=26 op=UNLOAD Jan 14 01:40:20.567000 audit: BPF prog-id=27 op=UNLOAD Jan 14 01:40:20.567000 audit: BPF prog-id=33 op=LOAD Jan 14 01:40:20.567000 audit: BPF prog-id=18 op=UNLOAD Jan 14 01:40:20.567000 audit: BPF prog-id=34 op=LOAD Jan 14 01:40:20.567000 audit: BPF prog-id=35 op=LOAD Jan 14 01:40:20.568000 audit: BPF prog-id=19 op=UNLOAD Jan 14 01:40:20.568000 audit: BPF prog-id=20 op=UNLOAD Jan 14 01:40:20.568000 audit: BPF prog-id=36 op=LOAD Jan 14 01:40:20.568000 audit: BPF prog-id=21 op=UNLOAD Jan 14 01:40:20.569000 audit: BPF prog-id=37 op=LOAD Jan 14 01:40:20.569000 audit: BPF prog-id=15 op=UNLOAD Jan 14 01:40:20.569000 audit: BPF prog-id=38 op=LOAD Jan 14 01:40:20.569000 audit: BPF prog-id=39 op=LOAD Jan 14 01:40:20.569000 audit: BPF prog-id=16 op=UNLOAD Jan 14 01:40:20.569000 audit: BPF prog-id=17 op=UNLOAD Jan 14 01:40:20.570000 audit: BPF prog-id=40 op=LOAD Jan 14 01:40:20.570000 audit: BPF prog-id=22 op=UNLOAD Jan 14 01:40:20.570000 audit: BPF prog-id=41 op=LOAD Jan 14 01:40:20.570000 audit: BPF prog-id=42 op=LOAD Jan 14 01:40:20.570000 audit: BPF prog-id=23 op=UNLOAD Jan 14 01:40:20.570000 audit: BPF prog-id=24 op=UNLOAD Jan 14 01:40:20.578604 systemd-tmpfiles[1448]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 14 01:40:20.578636 systemd-tmpfiles[1448]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 14 01:40:20.578902 systemd-tmpfiles[1448]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 14 01:40:20.579178 systemd[1]: Reload requested from client PID 1447 ('systemctl') (unit ensure-sysext.service)... Jan 14 01:40:20.579193 systemd[1]: Reloading... 
Jan 14 01:40:20.579867 systemd-tmpfiles[1448]: ACLs are not supported, ignoring. Jan 14 01:40:20.579921 systemd-tmpfiles[1448]: ACLs are not supported, ignoring. Jan 14 01:40:20.589559 systemd-tmpfiles[1448]: Detected autofs mount point /boot during canonicalization of boot. Jan 14 01:40:20.589576 systemd-tmpfiles[1448]: Skipping /boot Jan 14 01:40:20.593885 systemd-udevd[1449]: Using default interface naming scheme 'v257'. Jan 14 01:40:20.596022 systemd-tmpfiles[1448]: Detected autofs mount point /boot during canonicalization of boot. Jan 14 01:40:20.596041 systemd-tmpfiles[1448]: Skipping /boot Jan 14 01:40:20.632840 zram_generator::config[1481]: No configuration found. Jan 14 01:40:20.740773 kernel: mousedev: PS/2 mouse device common for all mice Jan 14 01:40:20.823058 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 14 01:40:20.824694 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Jan 14 01:40:20.824913 systemd[1]: Reloading finished in 245 ms. Jan 14 01:40:20.835803 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 14 01:40:20.836000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:20.838000 audit: BPF prog-id=43 op=LOAD Jan 14 01:40:20.838000 audit: BPF prog-id=44 op=LOAD Jan 14 01:40:20.838000 audit: BPF prog-id=28 op=UNLOAD Jan 14 01:40:20.838000 audit: BPF prog-id=29 op=UNLOAD Jan 14 01:40:20.839000 audit: BPF prog-id=45 op=LOAD Jan 14 01:40:20.839000 audit: BPF prog-id=33 op=UNLOAD Jan 14 01:40:20.839000 audit: BPF prog-id=46 op=LOAD Jan 14 01:40:20.839000 audit: BPF prog-id=47 op=LOAD Jan 14 01:40:20.839000 audit: BPF prog-id=34 op=UNLOAD Jan 14 01:40:20.839000 audit: BPF prog-id=35 op=UNLOAD Jan 14 01:40:20.839000 audit: BPF prog-id=48 op=LOAD Jan 14 01:40:20.839000 audit: BPF prog-id=37 op=UNLOAD Jan 14 01:40:20.840000 audit: BPF prog-id=49 op=LOAD Jan 14 01:40:20.840000 audit: BPF prog-id=50 op=LOAD Jan 14 01:40:20.840000 audit: BPF prog-id=38 op=UNLOAD Jan 14 01:40:20.840000 audit: BPF prog-id=39 op=UNLOAD Jan 14 01:40:20.840000 audit: BPF prog-id=51 op=LOAD Jan 14 01:40:20.840000 audit: BPF prog-id=30 op=UNLOAD Jan 14 01:40:20.840000 audit: BPF prog-id=52 op=LOAD Jan 14 01:40:20.840000 audit: BPF prog-id=53 op=LOAD Jan 14 01:40:20.840000 audit: BPF prog-id=31 op=UNLOAD Jan 14 01:40:20.840000 audit: BPF prog-id=32 op=UNLOAD Jan 14 01:40:20.841000 audit: BPF prog-id=54 op=LOAD Jan 14 01:40:20.841000 audit: BPF prog-id=36 op=UNLOAD Jan 14 01:40:20.841000 audit: BPF prog-id=55 op=LOAD Jan 14 01:40:20.841000 audit: BPF prog-id=40 op=UNLOAD Jan 14 01:40:20.841000 audit: BPF prog-id=56 op=LOAD Jan 14 01:40:20.841000 audit: BPF prog-id=57 op=LOAD Jan 14 01:40:20.841000 audit: BPF prog-id=41 op=UNLOAD Jan 14 01:40:20.841000 audit: BPF prog-id=42 op=UNLOAD Jan 14 01:40:20.855173 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 14 01:40:20.856000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:20.859050 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Jan 14 01:40:20.859113 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 14 01:40:20.859131 kernel: [drm] features: -context_init Jan 14 01:40:20.865747 kernel: [drm] number of scanouts: 1 Jan 14 01:40:20.865828 kernel: [drm] number of cap sets: 0 Jan 14 01:40:20.873744 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Jan 14 01:40:20.878747 kernel: Console: switching to colour frame buffer device 160x50 Jan 14 01:40:20.879621 systemd[1]: Finished ensure-sysext.service. Jan 14 01:40:20.899271 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 14 01:40:20.900000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.918160 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 14 01:40:20.922914 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 14 01:40:20.924142 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 01:40:20.948202 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 14 01:40:20.950409 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 14 01:40:20.952450 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 14 01:40:20.954922 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 14 01:40:20.959652 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 14 01:40:20.960963 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Jan 14 01:40:20.961071 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 14 01:40:20.962068 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 14 01:40:20.964074 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 14 01:40:20.965488 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 01:40:20.967123 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 14 01:40:20.972000 audit: BPF prog-id=58 op=LOAD Jan 14 01:40:20.974263 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 14 01:40:20.976824 systemd[1]: Reached target time-set.target - System Time Set. Jan 14 01:40:20.979277 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 14 01:40:20.979819 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 14 01:40:20.979973 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 14 01:40:20.983941 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 01:40:20.986178 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 14 01:40:20.986451 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 14 01:40:20.986779 kernel: PTP clock support registered Jan 14 01:40:20.986000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:20.986000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.987885 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 14 01:40:20.988068 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 14 01:40:20.988000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.988000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.989444 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 14 01:40:20.989631 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 14 01:40:20.992000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.992000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.993151 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 14 01:40:20.998027 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 14 01:40:20.998000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:20.998000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:20.999798 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 14 01:40:20.999984 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 14 01:40:21.000000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:21.000000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:21.001682 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 14 01:40:21.002000 audit[1587]: SYSTEM_BOOT pid=1587 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 14 01:40:21.002000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:21.012538 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 14 01:40:21.012636 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 14 01:40:21.018980 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. 
Jan 14 01:40:21.019000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:21.021289 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 14 01:40:21.021000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:21.045000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 14 01:40:21.045000 audit[1614]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffc89342a0 a2=420 a3=0 items=0 ppid=1569 pid=1614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:21.045000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 14 01:40:21.046850 augenrules[1614]: No rules Jan 14 01:40:21.047850 systemd[1]: audit-rules.service: Deactivated successfully. Jan 14 01:40:21.048241 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 14 01:40:21.073068 systemd-networkd[1586]: lo: Link UP Jan 14 01:40:21.073079 systemd-networkd[1586]: lo: Gained carrier Jan 14 01:40:21.074895 systemd-networkd[1586]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:40:21.074907 systemd-networkd[1586]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 14 01:40:21.075087 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Jan 14 01:40:21.076257 systemd-networkd[1586]: eth0: Link UP Jan 14 01:40:21.076557 systemd[1]: Reached target network.target - Network. Jan 14 01:40:21.076724 systemd-networkd[1586]: eth0: Gained carrier Jan 14 01:40:21.076743 systemd-networkd[1586]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 01:40:21.080910 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 14 01:40:21.083589 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 14 01:40:21.086829 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 01:40:21.100344 systemd-networkd[1586]: eth0: DHCPv4 address 10.0.35.206/25, gateway 10.0.35.129 acquired from 10.0.35.129 Jan 14 01:40:21.109088 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 14 01:40:21.110832 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 14 01:40:21.113785 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 14 01:40:21.569920 ldconfig[1577]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 14 01:40:21.574316 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 14 01:40:21.578864 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 14 01:40:21.609398 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 14 01:40:21.610841 systemd[1]: Reached target sysinit.target - System Initialization. Jan 14 01:40:21.611929 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
Jan 14 01:40:21.613101 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 14 01:40:21.614489 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 14 01:40:21.615632 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 14 01:40:21.616902 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 14 01:40:21.618140 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 14 01:40:21.619200 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 14 01:40:21.620406 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 14 01:40:21.620443 systemd[1]: Reached target paths.target - Path Units. Jan 14 01:40:21.621322 systemd[1]: Reached target timers.target - Timer Units. Jan 14 01:40:21.624098 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 14 01:40:21.626652 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 14 01:40:21.629617 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 14 01:40:21.631193 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 14 01:40:21.632439 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 14 01:40:21.635792 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 14 01:40:21.637071 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 14 01:40:21.638831 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 14 01:40:21.639924 systemd[1]: Reached target sockets.target - Socket Units. Jan 14 01:40:21.640818 systemd[1]: Reached target basic.target - Basic System. 
Jan 14 01:40:21.641696 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 14 01:40:21.641764 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 14 01:40:21.644440 systemd[1]: Starting chronyd.service - NTP client/server... Jan 14 01:40:21.646256 systemd[1]: Starting containerd.service - containerd container runtime... Jan 14 01:40:21.648425 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 14 01:40:21.651893 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 14 01:40:21.653857 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 14 01:40:21.656749 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:40:21.657991 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 14 01:40:21.659993 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 14 01:40:21.661983 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 14 01:40:21.668244 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 14 01:40:21.671853 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 14 01:40:21.672623 jq[1641]: false Jan 14 01:40:21.674892 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 14 01:40:21.679404 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 14 01:40:21.683018 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 14 01:40:21.685204 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). 
Jan 14 01:40:21.685693 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 14 01:40:21.686140 extend-filesystems[1643]: Found /dev/vda6 Jan 14 01:40:21.686333 systemd[1]: Starting update-engine.service - Update Engine... Jan 14 01:40:21.688943 chronyd[1635]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 14 01:40:21.689985 chronyd[1635]: Loaded seccomp filter (level 2) Jan 14 01:40:21.690232 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 14 01:40:21.692541 extend-filesystems[1643]: Found /dev/vda9 Jan 14 01:40:21.693785 systemd[1]: Started chronyd.service - NTP client/server. Jan 14 01:40:21.695029 extend-filesystems[1643]: Checking size of /dev/vda9 Jan 14 01:40:21.703791 jq[1656]: true Jan 14 01:40:21.703759 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 14 01:40:21.705409 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 14 01:40:21.706343 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 14 01:40:21.706658 systemd[1]: motdgen.service: Deactivated successfully. Jan 14 01:40:21.709168 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 14 01:40:21.712551 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 14 01:40:21.712818 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Jan 14 01:40:21.730205 extend-filesystems[1643]: Resized partition /dev/vda9 Jan 14 01:40:21.736847 extend-filesystems[1689]: resize2fs 1.47.3 (8-Jul-2025) Jan 14 01:40:21.746732 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 14 01:40:21.746792 jq[1675]: true Jan 14 01:40:21.749189 update_engine[1654]: I20260114 01:40:21.748896 1654 main.cc:92] Flatcar Update Engine starting Jan 14 01:40:21.763998 tar[1671]: linux-arm64/LICENSE Jan 14 01:40:21.765972 tar[1671]: linux-arm64/helm Jan 14 01:40:21.781363 dbus-daemon[1638]: [system] SELinux support is enabled Jan 14 01:40:21.781616 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 14 01:40:21.785500 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 14 01:40:21.785531 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 14 01:40:21.787900 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 14 01:40:21.787924 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 14 01:40:21.793934 systemd[1]: Started update-engine.service - Update Engine. Jan 14 01:40:21.796512 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 14 01:40:21.800369 update_engine[1654]: I20260114 01:40:21.796305 1654 update_check_scheduler.cc:74] Next update check in 7m7s Jan 14 01:40:21.823661 systemd-logind[1651]: New seat seat0. 
Jan 14 01:40:21.863273 systemd-logind[1651]: Watching system buttons on /dev/input/event0 (Power Button) Jan 14 01:40:21.863301 systemd-logind[1651]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jan 14 01:40:21.863556 systemd[1]: Started systemd-logind.service - User Login Management. Jan 14 01:40:21.873111 locksmithd[1706]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 14 01:40:21.902654 bash[1707]: Updated "/home/core/.ssh/authorized_keys" Jan 14 01:40:21.906206 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 14 01:40:21.909947 systemd[1]: Starting sshkeys.service... Jan 14 01:40:21.922636 containerd[1676]: time="2026-01-14T01:40:21Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 14 01:40:21.925830 containerd[1676]: time="2026-01-14T01:40:21.925775160Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 14 01:40:21.931330 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 14 01:40:21.935265 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Jan 14 01:40:21.941730 containerd[1676]: time="2026-01-14T01:40:21.939066200Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.52µs" Jan 14 01:40:21.941730 containerd[1676]: time="2026-01-14T01:40:21.939105840Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 14 01:40:21.941730 containerd[1676]: time="2026-01-14T01:40:21.939146560Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 14 01:40:21.941730 containerd[1676]: time="2026-01-14T01:40:21.939157760Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 14 01:40:21.941730 containerd[1676]: time="2026-01-14T01:40:21.939297240Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 14 01:40:21.941730 containerd[1676]: time="2026-01-14T01:40:21.939313160Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 14 01:40:21.941730 containerd[1676]: time="2026-01-14T01:40:21.939360440Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 14 01:40:21.941730 containerd[1676]: time="2026-01-14T01:40:21.939370920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 14 01:40:21.941730 containerd[1676]: time="2026-01-14T01:40:21.939647920Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 14 01:40:21.941730 containerd[1676]: time="2026-01-14T01:40:21.939664400Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 14 01:40:21.941730 containerd[1676]: time="2026-01-14T01:40:21.939675400Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 14 01:40:21.941730 containerd[1676]: time="2026-01-14T01:40:21.939683240Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 14 01:40:21.941990 containerd[1676]: time="2026-01-14T01:40:21.939852880Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 14 01:40:21.941990 containerd[1676]: time="2026-01-14T01:40:21.939868080Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 14 01:40:21.942096 containerd[1676]: time="2026-01-14T01:40:21.939933440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 14 01:40:21.942317 containerd[1676]: time="2026-01-14T01:40:21.942292280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 14 01:40:21.942456 containerd[1676]: time="2026-01-14T01:40:21.942330840Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 14 01:40:21.942456 containerd[1676]: time="2026-01-14T01:40:21.942343560Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 14 01:40:21.942456 containerd[1676]: time="2026-01-14T01:40:21.942378920Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 14 01:40:21.942763 containerd[1676]: 
time="2026-01-14T01:40:21.942580920Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 14 01:40:21.942763 containerd[1676]: time="2026-01-14T01:40:21.942648640Z" level=info msg="metadata content store policy set" policy=shared Jan 14 01:40:21.954303 sshd_keygen[1667]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 14 01:40:21.960745 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:40:21.976783 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 14 01:40:21.978204 containerd[1676]: time="2026-01-14T01:40:21.978141880Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 14 01:40:21.978435 containerd[1676]: time="2026-01-14T01:40:21.978384920Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 14 01:40:21.979760 containerd[1676]: time="2026-01-14T01:40:21.978763360Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 14 01:40:21.979760 containerd[1676]: time="2026-01-14T01:40:21.978786240Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 14 01:40:21.979760 containerd[1676]: time="2026-01-14T01:40:21.978804920Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 14 01:40:21.979760 containerd[1676]: time="2026-01-14T01:40:21.978817440Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 14 01:40:21.979760 containerd[1676]: time="2026-01-14T01:40:21.978829440Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 14 01:40:21.979760 containerd[1676]: time="2026-01-14T01:40:21.978840160Z" level=info msg="loading 
plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 14 01:40:21.979760 containerd[1676]: time="2026-01-14T01:40:21.978851920Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 14 01:40:21.979760 containerd[1676]: time="2026-01-14T01:40:21.978864680Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 14 01:40:21.979760 containerd[1676]: time="2026-01-14T01:40:21.978876160Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 14 01:40:21.979760 containerd[1676]: time="2026-01-14T01:40:21.978886320Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 14 01:40:21.979760 containerd[1676]: time="2026-01-14T01:40:21.978896960Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 14 01:40:21.979760 containerd[1676]: time="2026-01-14T01:40:21.978909040Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 14 01:40:21.979760 containerd[1676]: time="2026-01-14T01:40:21.979042800Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 14 01:40:21.980195 containerd[1676]: time="2026-01-14T01:40:21.979063000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 14 01:40:21.980195 containerd[1676]: time="2026-01-14T01:40:21.979077800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 14 01:40:21.980195 containerd[1676]: time="2026-01-14T01:40:21.979106160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 14 01:40:21.980195 containerd[1676]: time="2026-01-14T01:40:21.979119800Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 14 01:40:21.980195 containerd[1676]: time="2026-01-14T01:40:21.979131800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 14 01:40:21.980195 containerd[1676]: time="2026-01-14T01:40:21.979143880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 14 01:40:21.980195 containerd[1676]: time="2026-01-14T01:40:21.979153760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 14 01:40:21.980195 containerd[1676]: time="2026-01-14T01:40:21.979166560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 14 01:40:21.980195 containerd[1676]: time="2026-01-14T01:40:21.979176880Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 14 01:40:21.980195 containerd[1676]: time="2026-01-14T01:40:21.979187320Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 14 01:40:21.980195 containerd[1676]: time="2026-01-14T01:40:21.979213960Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 14 01:40:21.980195 containerd[1676]: time="2026-01-14T01:40:21.979260800Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 14 01:40:21.980195 containerd[1676]: time="2026-01-14T01:40:21.979277600Z" level=info msg="Start snapshots syncer" Jan 14 01:40:21.980195 containerd[1676]: time="2026-01-14T01:40:21.979313840Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 14 01:40:21.980436 containerd[1676]: time="2026-01-14T01:40:21.979542880Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 14 01:40:21.980436 containerd[1676]: time="2026-01-14T01:40:21.979592320Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 14 01:40:21.980545 containerd[1676]: 
time="2026-01-14T01:40:21.979645680Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 14 01:40:21.980545 containerd[1676]: time="2026-01-14T01:40:21.980288160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 14 01:40:21.980545 containerd[1676]: time="2026-01-14T01:40:21.980318120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 14 01:40:21.980545 containerd[1676]: time="2026-01-14T01:40:21.980332440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 14 01:40:21.980545 containerd[1676]: time="2026-01-14T01:40:21.980343160Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 14 01:40:21.980545 containerd[1676]: time="2026-01-14T01:40:21.980354400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 14 01:40:21.980545 containerd[1676]: time="2026-01-14T01:40:21.980402440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 14 01:40:21.980545 containerd[1676]: time="2026-01-14T01:40:21.980419280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 14 01:40:21.980545 containerd[1676]: time="2026-01-14T01:40:21.980430720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 14 01:40:21.980545 containerd[1676]: time="2026-01-14T01:40:21.980441880Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 14 01:40:21.980545 containerd[1676]: time="2026-01-14T01:40:21.980488600Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 14 01:40:21.980545 containerd[1676]: 
time="2026-01-14T01:40:21.980504960Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 14 01:40:21.980545 containerd[1676]: time="2026-01-14T01:40:21.980513760Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 14 01:40:21.980821 containerd[1676]: time="2026-01-14T01:40:21.980522960Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 14 01:40:21.980821 containerd[1676]: time="2026-01-14T01:40:21.980704800Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 14 01:40:21.980821 containerd[1676]: time="2026-01-14T01:40:21.980745160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 14 01:40:21.980821 containerd[1676]: time="2026-01-14T01:40:21.980759480Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 14 01:40:21.980897 containerd[1676]: time="2026-01-14T01:40:21.980884960Z" level=info msg="runtime interface created" Jan 14 01:40:21.980897 containerd[1676]: time="2026-01-14T01:40:21.980895560Z" level=info msg="created NRI interface" Jan 14 01:40:21.980932 containerd[1676]: time="2026-01-14T01:40:21.980904720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 14 01:40:21.980932 containerd[1676]: time="2026-01-14T01:40:21.980918080Z" level=info msg="Connect containerd service" Jan 14 01:40:21.980965 containerd[1676]: time="2026-01-14T01:40:21.980941040Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 14 01:40:21.981117 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Jan 14 01:40:21.982602 containerd[1676]: time="2026-01-14T01:40:21.982390360Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 14 01:40:22.000026 systemd[1]: issuegen.service: Deactivated successfully. Jan 14 01:40:22.001791 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 14 01:40:22.006112 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 14 01:40:22.032101 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 14 01:40:22.037091 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 14 01:40:22.042529 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 14 01:40:22.044153 systemd[1]: Reached target getty.target - Login Prompts. Jan 14 01:40:22.070770 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 14 01:40:22.090321 extend-filesystems[1689]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 14 01:40:22.090321 extend-filesystems[1689]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 14 01:40:22.090321 extend-filesystems[1689]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Jan 14 01:40:22.094819 extend-filesystems[1643]: Resized filesystem in /dev/vda9 Jan 14 01:40:22.093634 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 14 01:40:22.093924 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
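The `failed to load cni during init` error at the start of this stretch means `/etc/cni/net.d` held no network config when containerd started; on a node that later joins a cluster, the CNI plugin normally installs one, so the error is transient. Purely as a sketch of what containerd is looking for (name, bridge, and subnet below are hypothetical, not taken from this host), a minimal conflist looks like:

```
// /etc/cni/net.d/10-example.conflist — hypothetical example, not from this system
{
  "cniVersion": "1.0.0",
  "name": "example-net",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": {
        "type": "host-local",
        "subnet": "10.88.0.0/16"
      }
    }
  ]
}
```

Once any valid conflist appears in that directory, the "cni network conf syncer" started further down picks it up without a containerd restart.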
Jan 14 01:40:22.110741 containerd[1676]: time="2026-01-14T01:40:22.110275880Z" level=info msg="Start subscribing containerd event" Jan 14 01:40:22.110741 containerd[1676]: time="2026-01-14T01:40:22.110353360Z" level=info msg="Start recovering state" Jan 14 01:40:22.110741 containerd[1676]: time="2026-01-14T01:40:22.110446400Z" level=info msg="Start event monitor" Jan 14 01:40:22.110741 containerd[1676]: time="2026-01-14T01:40:22.110458320Z" level=info msg="Start cni network conf syncer for default" Jan 14 01:40:22.110741 containerd[1676]: time="2026-01-14T01:40:22.110465680Z" level=info msg="Start streaming server" Jan 14 01:40:22.110741 containerd[1676]: time="2026-01-14T01:40:22.110472920Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 14 01:40:22.110741 containerd[1676]: time="2026-01-14T01:40:22.110479440Z" level=info msg="runtime interface starting up..." Jan 14 01:40:22.110741 containerd[1676]: time="2026-01-14T01:40:22.110484640Z" level=info msg="starting plugins..." Jan 14 01:40:22.110741 containerd[1676]: time="2026-01-14T01:40:22.110496600Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 14 01:40:22.111316 containerd[1676]: time="2026-01-14T01:40:22.111185880Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 14 01:40:22.111489 containerd[1676]: time="2026-01-14T01:40:22.111470240Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 14 01:40:22.111894 containerd[1676]: time="2026-01-14T01:40:22.111872000Z" level=info msg="containerd successfully booted in 0.189762s" Jan 14 01:40:22.112062 systemd[1]: Started containerd.service - containerd container runtime. Jan 14 01:40:22.180923 tar[1671]: linux-arm64/README.md Jan 14 01:40:22.200670 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 14 01:40:22.307987 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
Jan 14 01:40:22.310360 systemd[1]: Started sshd@0-10.0.35.206:22-4.153.228.146:50752.service - OpenSSH per-connection server daemon (4.153.228.146:50752). Jan 14 01:40:22.670755 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:40:22.865298 sshd[1760]: Accepted publickey for core from 4.153.228.146 port 50752 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:40:22.867454 sshd-session[1760]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:40:22.876705 systemd-logind[1651]: New session 1 of user core. Jan 14 01:40:22.877960 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 14 01:40:22.880067 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 14 01:40:22.909752 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 14 01:40:22.913095 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 14 01:40:22.937956 (systemd)[1767]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:40:22.940321 systemd-logind[1651]: New session 2 of user core. Jan 14 01:40:22.973751 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:40:23.042966 systemd-networkd[1586]: eth0: Gained IPv6LL Jan 14 01:40:23.045423 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 14 01:40:23.048900 systemd[1]: Reached target network-online.target - Network is Online. Jan 14 01:40:23.051374 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:40:23.053486 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 14 01:40:23.066316 systemd[1767]: Queued start job for default target default.target. Jan 14 01:40:23.067567 systemd[1767]: Created slice app.slice - User Application Slice. 
Jan 14 01:40:23.067688 systemd[1767]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 14 01:40:23.067794 systemd[1767]: Reached target paths.target - Paths. Jan 14 01:40:23.067936 systemd[1767]: Reached target timers.target - Timers. Jan 14 01:40:23.071837 systemd[1767]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 14 01:40:23.072691 systemd[1767]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 14 01:40:23.084781 systemd[1767]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 14 01:40:23.086963 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 14 01:40:23.087396 systemd[1767]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 14 01:40:23.087508 systemd[1767]: Reached target sockets.target - Sockets. Jan 14 01:40:23.087548 systemd[1767]: Reached target basic.target - Basic System. Jan 14 01:40:23.087577 systemd[1767]: Reached target default.target - Main User Target. Jan 14 01:40:23.087601 systemd[1767]: Startup finished in 142ms. Jan 14 01:40:23.088769 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 14 01:40:23.098071 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 14 01:40:23.407404 systemd[1]: Started sshd@1-10.0.35.206:22-4.153.228.146:50754.service - OpenSSH per-connection server daemon (4.153.228.146:50754). Jan 14 01:40:23.931109 sshd[1794]: Accepted publickey for core from 4.153.228.146 port 50754 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:40:23.932371 sshd-session[1794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:40:23.938234 systemd-logind[1651]: New session 3 of user core. Jan 14 01:40:23.947103 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 14 01:40:23.955887 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 14 01:40:23.960026 (kubelet)[1804]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:40:24.225558 sshd[1802]: Connection closed by 4.153.228.146 port 50754 Jan 14 01:40:24.225270 sshd-session[1794]: pam_unix(sshd:session): session closed for user core Jan 14 01:40:24.230772 systemd[1]: sshd@1-10.0.35.206:22-4.153.228.146:50754.service: Deactivated successfully. Jan 14 01:40:24.232933 systemd[1]: session-3.scope: Deactivated successfully. Jan 14 01:40:24.236556 systemd-logind[1651]: Session 3 logged out. Waiting for processes to exit. Jan 14 01:40:24.237316 systemd-logind[1651]: Removed session 3. Jan 14 01:40:24.332592 systemd[1]: Started sshd@2-10.0.35.206:22-4.153.228.146:59138.service - OpenSSH per-connection server daemon (4.153.228.146:59138). Jan 14 01:40:24.604574 kubelet[1804]: E0114 01:40:24.604499 1804 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:40:24.607142 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:40:24.607376 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:40:24.607792 systemd[1]: kubelet.service: Consumed 801ms CPU time, 257.9M memory peak. Jan 14 01:40:24.683784 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:40:24.853395 sshd[1815]: Accepted publickey for core from 4.153.228.146 port 59138 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:40:24.854764 sshd-session[1815]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:40:24.858777 systemd-logind[1651]: New session 4 of user core. 
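The kubelet failure above (`open /var/lib/kubelet/config.yaml: no such file or directory`) is expected on a node that has not yet run `kubeadm init` or `kubeadm join`, since kubeadm is what writes that file; systemd then retries the unit, as the later `Scheduled restart` entries show. For orientation only, a sketch of the file kubeadm would generate (field values illustrative, not this host's):

```
# /var/lib/kubelet/config.yaml — normally written by kubeadm, shown here
# only as a hypothetical sketch of the expected format
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd
staticPodPath: /etc/kubernetes/manifests
clusterDNS:
  - 10.96.0.10
clusterDomain: cluster.local
```

Hand-writing this file silences the restart loop but is not a substitute for the join flow, which also provisions the kubeconfig and client certificates the kubelet needs.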
Jan 14 01:40:24.869656 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 14 01:40:24.981760 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:40:25.147021 sshd[1822]: Connection closed by 4.153.228.146 port 59138 Jan 14 01:40:25.146988 sshd-session[1815]: pam_unix(sshd:session): session closed for user core Jan 14 01:40:25.151688 systemd[1]: sshd@2-10.0.35.206:22-4.153.228.146:59138.service: Deactivated successfully. Jan 14 01:40:25.153304 systemd[1]: session-4.scope: Deactivated successfully. Jan 14 01:40:25.154162 systemd-logind[1651]: Session 4 logged out. Waiting for processes to exit. Jan 14 01:40:25.155271 systemd-logind[1651]: Removed session 4. Jan 14 01:40:28.695785 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:40:28.700960 coreos-metadata[1637]: Jan 14 01:40:28.700 WARN failed to locate config-drive, using the metadata service API instead Jan 14 01:40:28.717005 coreos-metadata[1637]: Jan 14 01:40:28.716 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 14 01:40:28.991401 coreos-metadata[1637]: Jan 14 01:40:28.991 INFO Fetch successful Jan 14 01:40:28.991664 coreos-metadata[1637]: Jan 14 01:40:28.991 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 14 01:40:28.991765 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 14 01:40:28.999452 coreos-metadata[1721]: Jan 14 01:40:28.999 WARN failed to locate config-drive, using the metadata service API instead Jan 14 01:40:29.012547 coreos-metadata[1721]: Jan 14 01:40:29.012 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 14 01:40:29.265055 coreos-metadata[1637]: Jan 14 01:40:29.264 INFO Fetch successful Jan 14 01:40:29.265055 coreos-metadata[1637]: Jan 14 01:40:29.264 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 14 01:40:29.267075 coreos-metadata[1721]: Jan 14 01:40:29.267 INFO Fetch successful Jan 14 
01:40:29.267075 coreos-metadata[1721]: Jan 14 01:40:29.267 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 14 01:40:29.528532 coreos-metadata[1637]: Jan 14 01:40:29.528 INFO Fetch successful Jan 14 01:40:29.528532 coreos-metadata[1637]: Jan 14 01:40:29.528 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 14 01:40:29.626932 coreos-metadata[1721]: Jan 14 01:40:29.626 INFO Fetch successful Jan 14 01:40:29.629454 unknown[1721]: wrote ssh authorized keys file for user: core Jan 14 01:40:29.663953 update-ssh-keys[1837]: Updated "/home/core/.ssh/authorized_keys" Jan 14 01:40:29.664912 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 14 01:40:29.666561 systemd[1]: Finished sshkeys.service. Jan 14 01:40:29.670222 coreos-metadata[1637]: Jan 14 01:40:29.670 INFO Fetch successful Jan 14 01:40:29.670222 coreos-metadata[1637]: Jan 14 01:40:29.670 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 14 01:40:29.809395 coreos-metadata[1637]: Jan 14 01:40:29.809 INFO Fetch successful Jan 14 01:40:29.809395 coreos-metadata[1637]: Jan 14 01:40:29.809 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 14 01:40:29.945513 coreos-metadata[1637]: Jan 14 01:40:29.945 INFO Fetch successful Jan 14 01:40:29.971548 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 14 01:40:29.972007 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 14 01:40:29.972159 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 14 01:40:29.974793 systemd[1]: Startup finished in 2.786s (kernel) + 11.954s (initrd) + 10.910s (userspace) = 25.652s. Jan 14 01:40:34.759186 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
Jan 14 01:40:34.760701 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:40:34.913624 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:40:34.917418 (kubelet)[1853]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:40:34.947145 kubelet[1853]: E0114 01:40:34.947079 1853 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:40:34.950412 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:40:34.950548 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:40:34.951143 systemd[1]: kubelet.service: Consumed 138ms CPU time, 105.2M memory peak. Jan 14 01:40:35.261282 systemd[1]: Started sshd@3-10.0.35.206:22-4.153.228.146:38370.service - OpenSSH per-connection server daemon (4.153.228.146:38370). Jan 14 01:40:35.817810 sshd[1862]: Accepted publickey for core from 4.153.228.146 port 38370 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:40:35.819138 sshd-session[1862]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:40:35.824788 systemd-logind[1651]: New session 5 of user core. Jan 14 01:40:35.830045 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 14 01:40:36.120850 sshd[1866]: Connection closed by 4.153.228.146 port 38370 Jan 14 01:40:36.121189 sshd-session[1862]: pam_unix(sshd:session): session closed for user core Jan 14 01:40:36.125370 systemd[1]: sshd@3-10.0.35.206:22-4.153.228.146:38370.service: Deactivated successfully. Jan 14 01:40:36.127113 systemd[1]: session-5.scope: Deactivated successfully. 
Jan 14 01:40:36.127861 systemd-logind[1651]: Session 5 logged out. Waiting for processes to exit. Jan 14 01:40:36.128655 systemd-logind[1651]: Removed session 5. Jan 14 01:40:36.234923 systemd[1]: Started sshd@4-10.0.35.206:22-4.153.228.146:38372.service - OpenSSH per-connection server daemon (4.153.228.146:38372). Jan 14 01:40:36.793701 sshd[1872]: Accepted publickey for core from 4.153.228.146 port 38372 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:40:36.794637 sshd-session[1872]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:40:36.798614 systemd-logind[1651]: New session 6 of user core. Jan 14 01:40:36.808058 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 14 01:40:37.092829 sshd[1876]: Connection closed by 4.153.228.146 port 38372 Jan 14 01:40:37.092585 sshd-session[1872]: pam_unix(sshd:session): session closed for user core Jan 14 01:40:37.096646 systemd[1]: sshd@4-10.0.35.206:22-4.153.228.146:38372.service: Deactivated successfully. Jan 14 01:40:37.098246 systemd[1]: session-6.scope: Deactivated successfully. Jan 14 01:40:37.099358 systemd-logind[1651]: Session 6 logged out. Waiting for processes to exit. Jan 14 01:40:37.100178 systemd-logind[1651]: Removed session 6. Jan 14 01:40:37.200749 systemd[1]: Started sshd@5-10.0.35.206:22-4.153.228.146:38380.service - OpenSSH per-connection server daemon (4.153.228.146:38380). Jan 14 01:40:37.761771 sshd[1882]: Accepted publickey for core from 4.153.228.146 port 38380 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:40:37.762978 sshd-session[1882]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:40:37.767671 systemd-logind[1651]: New session 7 of user core. Jan 14 01:40:37.774889 systemd[1]: Started session-7.scope - Session 7 of User core. 
Jan 14 01:40:38.063928 sshd[1886]: Connection closed by 4.153.228.146 port 38380 Jan 14 01:40:38.064272 sshd-session[1882]: pam_unix(sshd:session): session closed for user core Jan 14 01:40:38.068162 systemd[1]: sshd@5-10.0.35.206:22-4.153.228.146:38380.service: Deactivated successfully. Jan 14 01:40:38.069873 systemd[1]: session-7.scope: Deactivated successfully. Jan 14 01:40:38.070750 systemd-logind[1651]: Session 7 logged out. Waiting for processes to exit. Jan 14 01:40:38.071879 systemd-logind[1651]: Removed session 7. Jan 14 01:40:38.174182 systemd[1]: Started sshd@6-10.0.35.206:22-4.153.228.146:38396.service - OpenSSH per-connection server daemon (4.153.228.146:38396). Jan 14 01:40:38.702615 sshd[1892]: Accepted publickey for core from 4.153.228.146 port 38396 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:40:38.703994 sshd-session[1892]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:40:38.708611 systemd-logind[1651]: New session 8 of user core. Jan 14 01:40:38.719057 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 14 01:40:38.916542 sudo[1897]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 14 01:40:38.916849 sudo[1897]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 01:40:38.926964 sudo[1897]: pam_unix(sudo:session): session closed for user root Jan 14 01:40:39.023370 sshd[1896]: Connection closed by 4.153.228.146 port 38396 Jan 14 01:40:39.023820 sshd-session[1892]: pam_unix(sshd:session): session closed for user core Jan 14 01:40:39.028390 systemd[1]: sshd@6-10.0.35.206:22-4.153.228.146:38396.service: Deactivated successfully. Jan 14 01:40:39.030310 systemd[1]: session-8.scope: Deactivated successfully. Jan 14 01:40:39.031334 systemd-logind[1651]: Session 8 logged out. Waiting for processes to exit. Jan 14 01:40:39.032749 systemd-logind[1651]: Removed session 8. 
Jan 14 01:40:39.136114 systemd[1]: Started sshd@7-10.0.35.206:22-4.153.228.146:38400.service - OpenSSH per-connection server daemon (4.153.228.146:38400). Jan 14 01:40:39.673938 sshd[1904]: Accepted publickey for core from 4.153.228.146 port 38400 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:40:39.675287 sshd-session[1904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:40:39.679387 systemd-logind[1651]: New session 9 of user core. Jan 14 01:40:39.694390 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 14 01:40:39.873126 sudo[1910]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 14 01:40:39.873385 sudo[1910]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 01:40:39.875708 sudo[1910]: pam_unix(sudo:session): session closed for user root Jan 14 01:40:39.881527 sudo[1909]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 14 01:40:39.881804 sudo[1909]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 01:40:39.889205 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 14 01:40:39.931000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 14 01:40:39.932553 augenrules[1934]: No rules Jan 14 01:40:39.932880 kernel: kauditd_printk_skb: 186 callbacks suppressed Jan 14 01:40:39.932918 kernel: audit: type=1305 audit(1768354839.931:230): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 14 01:40:39.934834 systemd[1]: audit-rules.service: Deactivated successfully. Jan 14 01:40:39.935056 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Jan 14 01:40:39.931000 audit[1934]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffe3107af0 a2=420 a3=0 items=0 ppid=1915 pid=1934 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:39.936567 sudo[1909]: pam_unix(sudo:session): session closed for user root Jan 14 01:40:39.939312 kernel: audit: type=1300 audit(1768354839.931:230): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffe3107af0 a2=420 a3=0 items=0 ppid=1915 pid=1934 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:39.939392 kernel: audit: type=1327 audit(1768354839.931:230): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 14 01:40:39.931000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 14 01:40:39.934000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:39.943493 kernel: audit: type=1130 audit(1768354839.934:231): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:39.943541 kernel: audit: type=1131 audit(1768354839.934:232): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:39.934000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:39.946041 kernel: audit: type=1106 audit(1768354839.935:233): pid=1909 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:40:39.935000 audit[1909]: USER_END pid=1909 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:40:39.935000 audit[1909]: CRED_DISP pid=1909 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:40:39.955224 kernel: audit: type=1104 audit(1768354839.935:234): pid=1909 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:40:40.037823 sshd[1908]: Connection closed by 4.153.228.146 port 38400 Jan 14 01:40:40.038382 sshd-session[1904]: pam_unix(sshd:session): session closed for user core Jan 14 01:40:40.038000 audit[1904]: USER_END pid=1904 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:40:40.042399 systemd[1]: sshd@7-10.0.35.206:22-4.153.228.146:38400.service: Deactivated successfully. 
Jan 14 01:40:40.039000 audit[1904]: CRED_DISP pid=1904 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:40:40.043966 systemd[1]: session-9.scope: Deactivated successfully. Jan 14 01:40:40.045519 systemd-logind[1651]: Session 9 logged out. Waiting for processes to exit. Jan 14 01:40:40.046819 systemd-logind[1651]: Removed session 9. Jan 14 01:40:40.047730 kernel: audit: type=1106 audit(1768354840.038:235): pid=1904 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:40:40.047794 kernel: audit: type=1104 audit(1768354840.039:236): pid=1904 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:40:40.047821 kernel: audit: type=1131 audit(1768354840.041:237): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.35.206:22-4.153.228.146:38400 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:40.041000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.35.206:22-4.153.228.146:38400 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:40.154000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.35.206:22-4.153.228.146:38402 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:40.154913 systemd[1]: Started sshd@8-10.0.35.206:22-4.153.228.146:38402.service - OpenSSH per-connection server daemon (4.153.228.146:38402). Jan 14 01:40:40.690000 audit[1943]: USER_ACCT pid=1943 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:40:40.690955 sshd[1943]: Accepted publickey for core from 4.153.228.146 port 38402 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:40:40.691000 audit[1943]: CRED_ACQ pid=1943 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:40:40.691000 audit[1943]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffde2d5620 a2=3 a3=0 items=0 ppid=1 pid=1943 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:40.691000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:40:40.692194 sshd-session[1943]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:40:40.695862 systemd-logind[1651]: New session 10 of user core. Jan 14 01:40:40.710097 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 14 01:40:40.711000 audit[1943]: USER_START pid=1943 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:40:40.713000 audit[1947]: CRED_ACQ pid=1947 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:40:40.895000 audit[1948]: USER_ACCT pid=1948 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:40:40.896113 sudo[1948]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 14 01:40:40.895000 audit[1948]: CRED_REFR pid=1948 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:40:40.895000 audit[1948]: USER_START pid=1948 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:40:40.896382 sudo[1948]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 01:40:41.224387 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Jan 14 01:40:41.246343 (dockerd)[1969]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 14 01:40:41.488763 dockerd[1969]: time="2026-01-14T01:40:41.488615560Z" level=info msg="Starting up" Jan 14 01:40:41.489629 dockerd[1969]: time="2026-01-14T01:40:41.489584480Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 14 01:40:41.500077 dockerd[1969]: time="2026-01-14T01:40:41.500032040Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 14 01:40:41.536193 dockerd[1969]: time="2026-01-14T01:40:41.536076200Z" level=info msg="Loading containers: start." Jan 14 01:40:41.545752 kernel: Initializing XFRM netlink socket Jan 14 01:40:41.594000 audit[2020]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2020 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.594000 audit[2020]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffd9a3e5f0 a2=0 a3=0 items=0 ppid=1969 pid=2020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.594000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 14 01:40:41.596000 audit[2022]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2022 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.596000 audit[2022]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffe17c5120 a2=0 a3=0 items=0 ppid=1969 pid=2022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:40:41.596000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 14 01:40:41.597000 audit[2024]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2024 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.597000 audit[2024]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff1b5f640 a2=0 a3=0 items=0 ppid=1969 pid=2024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.597000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 14 01:40:41.599000 audit[2026]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2026 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.599000 audit[2026]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff0a5e9b0 a2=0 a3=0 items=0 ppid=1969 pid=2026 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.599000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 14 01:40:41.601000 audit[2028]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2028 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.601000 audit[2028]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffe9e7acd0 a2=0 a3=0 items=0 ppid=1969 pid=2028 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.601000 audit: 
PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 14 01:40:41.602000 audit[2030]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2030 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.602000 audit[2030]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffe2366060 a2=0 a3=0 items=0 ppid=1969 pid=2030 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.602000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 14 01:40:41.604000 audit[2032]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2032 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.604000 audit[2032]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc7e55bb0 a2=0 a3=0 items=0 ppid=1969 pid=2032 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.604000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 14 01:40:41.606000 audit[2034]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2034 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.606000 audit[2034]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=fffff5388970 a2=0 a3=0 items=0 ppid=1969 pid=2034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:40:41.606000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 14 01:40:41.638000 audit[2037]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2037 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.638000 audit[2037]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=ffffeba3dca0 a2=0 a3=0 items=0 ppid=1969 pid=2037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.638000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 14 01:40:41.640000 audit[2039]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2039 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.640000 audit[2039]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffe1471900 a2=0 a3=0 items=0 ppid=1969 pid=2039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.640000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 14 01:40:41.642000 audit[2041]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2041 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.642000 audit[2041]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffe1506480 a2=0 a3=0 items=0 ppid=1969 pid=2041 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.642000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 14 01:40:41.643000 audit[2043]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2043 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.643000 audit[2043]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffe3d6aca0 a2=0 a3=0 items=0 ppid=1969 pid=2043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.643000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 14 01:40:41.645000 audit[2045]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2045 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.645000 audit[2045]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffc60c7940 a2=0 a3=0 items=0 ppid=1969 pid=2045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.645000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 14 01:40:41.680000 audit[2075]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2075 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.680000 audit[2075]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffcee5c5c0 a2=0 a3=0 items=0 ppid=1969 pid=2075 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.680000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 14 01:40:41.681000 audit[2077]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2077 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.681000 audit[2077]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffedd9f870 a2=0 a3=0 items=0 ppid=1969 pid=2077 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.681000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 14 01:40:41.684000 audit[2079]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2079 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.684000 audit[2079]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe9ac0bf0 a2=0 a3=0 items=0 ppid=1969 pid=2079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.684000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 14 01:40:41.686000 audit[2081]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2081 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.686000 audit[2081]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffda9ab350 a2=0 a3=0 items=0 ppid=1969 pid=2081 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.686000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 14 01:40:41.687000 audit[2083]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2083 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.687000 audit[2083]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffed31d330 a2=0 a3=0 items=0 ppid=1969 pid=2083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.687000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 14 01:40:41.689000 audit[2085]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2085 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.689000 audit[2085]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc722eef0 a2=0 a3=0 items=0 ppid=1969 pid=2085 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.689000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 14 01:40:41.691000 audit[2087]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2087 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.691000 audit[2087]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffd64578d0 a2=0 a3=0 items=0 ppid=1969 pid=2087 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.691000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 14 01:40:41.693000 audit[2089]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2089 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.693000 audit[2089]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffe9bf48b0 a2=0 a3=0 items=0 ppid=1969 pid=2089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.693000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 14 01:40:41.695000 audit[2091]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2091 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.695000 audit[2091]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffc8c28eb0 a2=0 a3=0 items=0 ppid=1969 pid=2091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.695000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 14 01:40:41.696000 audit[2093]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2093 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.696000 audit[2093]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffc46bb6d0 a2=0 a3=0 items=0 ppid=1969 pid=2093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.696000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 14 01:40:41.698000 audit[2095]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2095 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.698000 audit[2095]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffdc0efba0 a2=0 a3=0 items=0 ppid=1969 pid=2095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.698000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 14 01:40:41.700000 audit[2097]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2097 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.700000 audit[2097]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffdaebcec0 a2=0 a3=0 items=0 ppid=1969 pid=2097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.700000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 14 01:40:41.702000 audit[2099]: NETFILTER_CFG 
table=filter:27 family=10 entries=1 op=nft_register_rule pid=2099 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.702000 audit[2099]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffc4324330 a2=0 a3=0 items=0 ppid=1969 pid=2099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.702000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 14 01:40:41.707000 audit[2104]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2104 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.707000 audit[2104]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd2d8c9f0 a2=0 a3=0 items=0 ppid=1969 pid=2104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.707000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 14 01:40:41.709000 audit[2106]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2106 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.709000 audit[2106]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffe9fe7b30 a2=0 a3=0 items=0 ppid=1969 pid=2106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.709000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 14 01:40:41.710000 audit[2108]: NETFILTER_CFG 
table=filter:30 family=2 entries=1 op=nft_register_rule pid=2108 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.710000 audit[2108]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffc0e92d70 a2=0 a3=0 items=0 ppid=1969 pid=2108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.710000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 14 01:40:41.712000 audit[2110]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2110 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.712000 audit[2110]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffe3718f20 a2=0 a3=0 items=0 ppid=1969 pid=2110 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.712000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 14 01:40:41.714000 audit[2112]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2112 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.714000 audit[2112]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffcf846240 a2=0 a3=0 items=0 ppid=1969 pid=2112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.714000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 14 01:40:41.716000 audit[2114]: NETFILTER_CFG table=filter:33 
family=10 entries=1 op=nft_register_rule pid=2114 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:40:41.716000 audit[2114]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffc471e830 a2=0 a3=0 items=0 ppid=1969 pid=2114 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.716000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 14 01:40:41.734000 audit[2119]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2119 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.734000 audit[2119]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=fffff4f54070 a2=0 a3=0 items=0 ppid=1969 pid=2119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.734000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 14 01:40:41.736000 audit[2121]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2121 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.736000 audit[2121]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffed306d80 a2=0 a3=0 items=0 ppid=1969 pid=2121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.736000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 14 01:40:41.743000 audit[2129]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2129 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.743000 audit[2129]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=fffffb8d5c20 a2=0 a3=0 items=0 ppid=1969 pid=2129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.743000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 14 01:40:41.754000 audit[2135]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2135 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.754000 audit[2135]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=fffff6b50150 a2=0 a3=0 items=0 ppid=1969 pid=2135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.754000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 14 01:40:41.756000 audit[2137]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2137 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.756000 audit[2137]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=ffffcb2daba0 a2=0 a3=0 items=0 ppid=1969 pid=2137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.756000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 14 01:40:41.758000 audit[2139]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2139 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.758000 audit[2139]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffe46e34f0 a2=0 a3=0 items=0 ppid=1969 pid=2139 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.758000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 14 01:40:41.760000 audit[2141]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2141 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:40:41.760000 audit[2141]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=ffffd96759b0 a2=0 a3=0 items=0 ppid=1969 pid=2141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.760000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 14 01:40:41.762000 audit[2143]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2143 subj=system_u:system_r:kernel_t:s0 
comm="iptables" Jan 14 01:40:41.762000 audit[2143]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffe92cc2e0 a2=0 a3=0 items=0 ppid=1969 pid=2143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:40:41.762000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 14 01:40:41.763335 systemd-networkd[1586]: docker0: Link UP Jan 14 01:40:41.767153 dockerd[1969]: time="2026-01-14T01:40:41.767108360Z" level=info msg="Loading containers: done." Jan 14 01:40:41.779609 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1334689495-merged.mount: Deactivated successfully. Jan 14 01:40:41.788354 dockerd[1969]: time="2026-01-14T01:40:41.788291760Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 14 01:40:41.788504 dockerd[1969]: time="2026-01-14T01:40:41.788382800Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 14 01:40:41.788573 dockerd[1969]: time="2026-01-14T01:40:41.788554520Z" level=info msg="Initializing buildkit" Jan 14 01:40:41.809830 dockerd[1969]: time="2026-01-14T01:40:41.809797360Z" level=info msg="Completed buildkit initialization" Jan 14 01:40:41.817452 dockerd[1969]: time="2026-01-14T01:40:41.817402680Z" level=info msg="Daemon has completed initialization" Jan 14 01:40:41.817809 dockerd[1969]: time="2026-01-14T01:40:41.817508200Z" level=info msg="API listen on /run/docker.sock" Jan 14 01:40:41.817684 systemd[1]: Started docker.service - Docker Application Container Engine. 
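The audit PROCTITLE records in the netfilter section above carry the full command line hex-encoded, with NUL bytes separating argv entries. A minimal sketch decoding the first such record (pid 2108) back into its iptables invocation; the helper name is illustrative, not part of any tool:

```python
def decode_proctitle(hex_value: str) -> list[str]:
    """Decode an audit PROCTITLE field: hex-encoded argv, NUL-separated."""
    raw = bytes.fromhex(hex_value)
    return [arg.decode() for arg in raw.split(b"\x00")]

# PROCTITLE value copied from the audit record for pid 2108 above:
args = decode_proctitle(
    "2F7573722F62696E2F69707461626C6573002D2D77616974"
    "002D4900464F5257415244002D6A00444F434B45522D55534552"
)
print(args)  # ['/usr/bin/iptables', '--wait', '-I', 'FORWARD', '-j', 'DOCKER-USER']
```

The same decoding applies to every PROCTITLE line above, recovering the Docker daemon's chain setup (`DOCKER-USER`, `DOCKER-FORWARD`, `DOCKER-ISOLATION-STAGE-1/2`, the MASQUERADE rule for 172.17.0.0/16, and so on).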
Jan 14 01:40:41.817000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:42.964687 containerd[1676]: time="2026-01-14T01:40:42.964641960Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\"" Jan 14 01:40:43.593774 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3158060082.mount: Deactivated successfully. Jan 14 01:40:44.528413 containerd[1676]: time="2026-01-14T01:40:44.528336520Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:44.529532 containerd[1676]: time="2026-01-14T01:40:44.529281920Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=25791094" Jan 14 01:40:44.530345 containerd[1676]: time="2026-01-14T01:40:44.530301160Z" level=info msg="ImageCreate event name:\"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:44.532902 containerd[1676]: time="2026-01-14T01:40:44.532868080Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:44.534713 containerd[1676]: time="2026-01-14T01:40:44.534553800Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"27383880\" in 1.56986908s" Jan 14 01:40:44.534713 containerd[1676]: time="2026-01-14T01:40:44.534589080Z" level=info 
msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\"" Jan 14 01:40:44.535982 containerd[1676]: time="2026-01-14T01:40:44.535958360Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\"" Jan 14 01:40:45.009001 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 14 01:40:45.010417 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:40:45.144750 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:40:45.144000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:45.145975 kernel: kauditd_printk_skb: 132 callbacks suppressed Jan 14 01:40:45.146042 kernel: audit: type=1130 audit(1768354845.144:288): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:45.149460 (kubelet)[2252]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:40:45.187413 kubelet[2252]: E0114 01:40:45.187352 2252 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:40:45.189950 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:40:45.190087 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
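The containerd "Pulled image … size … in …s" records above contain enough to compute pull throughput. A sketch parsing the kube-apiserver record (the message text is copied from the log; the regex is an assumption about which parts of the format are stable):

```python
import re

# "Pulled image" message for kube-apiserver, copied from the log above:
msg = ('Pulled image "registry.k8s.io/kube-apiserver:v1.33.7" with image id '
       '"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e", '
       'repo tag "registry.k8s.io/kube-apiserver:v1.33.7", repo digest '
       '"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab", '
       'size "27383880" in 1.56986908s')

# Extract image name, size in bytes, and elapsed seconds:
m = re.search(r'Pulled image "([^"]+)".*size "(\d+)" in ([\d.]+)s', msg)
image, size, secs = m.group(1), int(m.group(2)), float(m.group(3))
print(f"{image}: {size / secs / 1e6:.1f} MB/s")
```

Applied across the pulls below, this puts the node's registry throughput in the tens of MB/s, which is consistent with the other per-image durations in the log.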
Jan 14 01:40:45.191000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:40:45.191975 systemd[1]: kubelet.service: Consumed 142ms CPU time, 107.6M memory peak. Jan 14 01:40:45.195774 kernel: audit: type=1131 audit(1768354845.191:289): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:40:45.474523 chronyd[1635]: Selected source PHC0 Jan 14 01:40:45.850451 containerd[1676]: time="2026-01-14T01:40:45.850381116Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:45.852142 containerd[1676]: time="2026-01-14T01:40:45.852100587Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=23544927" Jan 14 01:40:45.853607 containerd[1676]: time="2026-01-14T01:40:45.853569070Z" level=info msg="ImageCreate event name:\"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:45.857550 containerd[1676]: time="2026-01-14T01:40:45.857504700Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:45.858771 containerd[1676]: time="2026-01-14T01:40:45.858716476Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest 
\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"25137562\" in 1.322727386s" Jan 14 01:40:45.858771 containerd[1676]: time="2026-01-14T01:40:45.858762193Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image reference \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\"" Jan 14 01:40:45.859583 containerd[1676]: time="2026-01-14T01:40:45.859522522Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\"" Jan 14 01:40:47.091823 containerd[1676]: time="2026-01-14T01:40:47.091771779Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:47.092856 containerd[1676]: time="2026-01-14T01:40:47.092821142Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=18289931" Jan 14 01:40:47.094042 containerd[1676]: time="2026-01-14T01:40:47.094009042Z" level=info msg="ImageCreate event name:\"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:47.099361 containerd[1676]: time="2026-01-14T01:40:47.099319772Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:47.101271 containerd[1676]: time="2026-01-14T01:40:47.101209933Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"19882566\" in 1.241651011s" Jan 14 01:40:47.101271 
containerd[1676]: time="2026-01-14T01:40:47.101276363Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\"" Jan 14 01:40:47.101775 containerd[1676]: time="2026-01-14T01:40:47.101747831Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\"" Jan 14 01:40:48.177172 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1176806000.mount: Deactivated successfully. Jan 14 01:40:48.502539 containerd[1676]: time="2026-01-14T01:40:48.502413638Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:48.503404 containerd[1676]: time="2026-01-14T01:40:48.503322253Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=28254952" Jan 14 01:40:48.504272 containerd[1676]: time="2026-01-14T01:40:48.504243479Z" level=info msg="ImageCreate event name:\"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:48.506631 containerd[1676]: time="2026-01-14T01:40:48.506592065Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:48.507439 containerd[1676]: time="2026-01-14T01:40:48.507381817Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"28257692\" in 1.405601616s" Jan 14 01:40:48.507533 containerd[1676]: time="2026-01-14T01:40:48.507507614Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\"" Jan 14 01:40:48.508204 containerd[1676]: time="2026-01-14T01:40:48.508176335Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Jan 14 01:40:49.289657 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3033094408.mount: Deactivated successfully. Jan 14 01:40:50.009693 containerd[1676]: time="2026-01-14T01:40:50.009618572Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:50.010277 containerd[1676]: time="2026-01-14T01:40:50.010210873Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=18790854" Jan 14 01:40:50.011401 containerd[1676]: time="2026-01-14T01:40:50.011361997Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:50.015118 containerd[1676]: time="2026-01-14T01:40:50.015063600Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:50.016202 containerd[1676]: time="2026-01-14T01:40:50.016161805Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.50794922s" Jan 14 01:40:50.016249 containerd[1676]: time="2026-01-14T01:40:50.016206524Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference 
\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Jan 14 01:40:50.016635 containerd[1676]: time="2026-01-14T01:40:50.016612311Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 14 01:40:50.555230 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount209848892.mount: Deactivated successfully. Jan 14 01:40:50.565475 containerd[1676]: time="2026-01-14T01:40:50.565407737Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 01:40:50.566278 containerd[1676]: time="2026-01-14T01:40:50.566221699Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 14 01:40:50.567270 containerd[1676]: time="2026-01-14T01:40:50.567220542Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 01:40:50.569613 containerd[1676]: time="2026-01-14T01:40:50.569561028Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 01:40:50.570333 containerd[1676]: time="2026-01-14T01:40:50.570295510Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 553.653439ms" Jan 14 01:40:50.570333 containerd[1676]: time="2026-01-14T01:40:50.570325990Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image 
reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Jan 14 01:40:50.570915 containerd[1676]: time="2026-01-14T01:40:50.570854111Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Jan 14 01:40:51.367196 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3213042523.mount: Deactivated successfully. Jan 14 01:40:53.109750 containerd[1676]: time="2026-01-14T01:40:53.109682167Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:53.110456 containerd[1676]: time="2026-01-14T01:40:53.110394849Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=68134789" Jan 14 01:40:53.111693 containerd[1676]: time="2026-01-14T01:40:53.111657292Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:53.115509 containerd[1676]: time="2026-01-14T01:40:53.115437622Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:40:53.116547 containerd[1676]: time="2026-01-14T01:40:53.116490504Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 2.545602873s" Jan 14 01:40:53.116547 containerd[1676]: time="2026-01-14T01:40:53.116539624Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Jan 14 01:40:55.259015 systemd[1]: kubelet.service: 
Scheduled restart job, restart counter is at 3. Jan 14 01:40:55.260785 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:40:55.416693 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:40:55.416000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:55.420788 kernel: audit: type=1130 audit(1768354855.416:290): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:55.420853 (kubelet)[2419]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 01:40:55.453866 kubelet[2419]: E0114 01:40:55.453789 2419 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 01:40:55.456342 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 01:40:55.456477 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 01:40:55.456848 systemd[1]: kubelet.service: Consumed 135ms CPU time, 107.3M memory peak. Jan 14 01:40:55.456000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Jan 14 01:40:55.460815 kernel: audit: type=1131 audit(1768354855.456:291): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:40:59.043390 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:40:59.043547 systemd[1]: kubelet.service: Consumed 135ms CPU time, 107.3M memory peak. Jan 14 01:40:59.041000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:59.048587 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:40:59.041000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:59.052127 kernel: audit: type=1130 audit(1768354859.041:292): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:59.052234 kernel: audit: type=1131 audit(1768354859.041:293): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:40:59.070502 systemd[1]: Reload requested from client PID 2435 ('systemctl') (unit session-10.scope)... Jan 14 01:40:59.070516 systemd[1]: Reloading... Jan 14 01:40:59.135791 zram_generator::config[2477]: No configuration found. Jan 14 01:40:59.317105 systemd[1]: Reloading finished in 246 ms. 
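The kernel audit lines above (e.g. `audit(1768354859.041:292)`) timestamp records as `audit(<epoch>.<millis>:<serial>)`. Converting the epoch back to UTC reproduces the journal's own prefix, which is a quick way to correlate kauditd output with the surrounding journal entries:

```python
from datetime import datetime, timezone

# Epoch taken from the audit(1768354859.041:292) record above;
# this journal appears to log in UTC, matching the kernel build stamp.
ts = datetime.fromtimestamp(1768354859.041, tz=timezone.utc)
print(ts.strftime("%b %d %H:%M:%S"))  # Jan 14 01:40:59
```

The trailing `:292` is the audit event serial, which ties the kernel `audit: type=…` lines to the matching `audit[1]: SERVICE_START/STOP` records.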
Jan 14 01:40:59.344000 audit: BPF prog-id=63 op=LOAD Jan 14 01:40:59.344000 audit: BPF prog-id=48 op=UNLOAD Jan 14 01:40:59.347742 kernel: audit: type=1334 audit(1768354859.344:294): prog-id=63 op=LOAD Jan 14 01:40:59.347790 kernel: audit: type=1334 audit(1768354859.344:295): prog-id=48 op=UNLOAD Jan 14 01:40:59.347807 kernel: audit: type=1334 audit(1768354859.344:296): prog-id=64 op=LOAD Jan 14 01:40:59.344000 audit: BPF prog-id=64 op=LOAD Jan 14 01:40:59.344000 audit: BPF prog-id=65 op=LOAD Jan 14 01:40:59.349643 kernel: audit: type=1334 audit(1768354859.344:297): prog-id=65 op=LOAD Jan 14 01:40:59.349675 kernel: audit: type=1334 audit(1768354859.344:298): prog-id=49 op=UNLOAD Jan 14 01:40:59.349698 kernel: audit: type=1334 audit(1768354859.344:299): prog-id=50 op=UNLOAD Jan 14 01:40:59.344000 audit: BPF prog-id=49 op=UNLOAD Jan 14 01:40:59.344000 audit: BPF prog-id=50 op=UNLOAD Jan 14 01:40:59.344000 audit: BPF prog-id=66 op=LOAD Jan 14 01:40:59.344000 audit: BPF prog-id=55 op=UNLOAD Jan 14 01:40:59.344000 audit: BPF prog-id=67 op=LOAD Jan 14 01:40:59.345000 audit: BPF prog-id=68 op=LOAD Jan 14 01:40:59.345000 audit: BPF prog-id=56 op=UNLOAD Jan 14 01:40:59.345000 audit: BPF prog-id=57 op=UNLOAD Jan 14 01:40:59.346000 audit: BPF prog-id=69 op=LOAD Jan 14 01:40:59.346000 audit: BPF prog-id=54 op=UNLOAD Jan 14 01:40:59.347000 audit: BPF prog-id=70 op=LOAD Jan 14 01:40:59.347000 audit: BPF prog-id=59 op=UNLOAD Jan 14 01:40:59.349000 audit: BPF prog-id=71 op=LOAD Jan 14 01:40:59.349000 audit: BPF prog-id=51 op=UNLOAD Jan 14 01:40:59.350000 audit: BPF prog-id=72 op=LOAD Jan 14 01:40:59.350000 audit: BPF prog-id=73 op=LOAD Jan 14 01:40:59.350000 audit: BPF prog-id=52 op=UNLOAD Jan 14 01:40:59.350000 audit: BPF prog-id=53 op=UNLOAD Jan 14 01:40:59.370000 audit: BPF prog-id=74 op=LOAD Jan 14 01:40:59.370000 audit: BPF prog-id=58 op=UNLOAD Jan 14 01:40:59.371000 audit: BPF prog-id=75 op=LOAD Jan 14 01:40:59.371000 audit: BPF prog-id=45 op=UNLOAD Jan 14 01:40:59.371000 
audit: BPF prog-id=76 op=LOAD Jan 14 01:40:59.371000 audit: BPF prog-id=77 op=LOAD Jan 14 01:40:59.371000 audit: BPF prog-id=46 op=UNLOAD Jan 14 01:40:59.371000 audit: BPF prog-id=47 op=UNLOAD Jan 14 01:40:59.371000 audit: BPF prog-id=78 op=LOAD Jan 14 01:40:59.371000 audit: BPF prog-id=60 op=UNLOAD Jan 14 01:40:59.371000 audit: BPF prog-id=79 op=LOAD Jan 14 01:40:59.371000 audit: BPF prog-id=80 op=LOAD Jan 14 01:40:59.371000 audit: BPF prog-id=61 op=UNLOAD Jan 14 01:40:59.371000 audit: BPF prog-id=62 op=UNLOAD Jan 14 01:40:59.371000 audit: BPF prog-id=81 op=LOAD Jan 14 01:40:59.371000 audit: BPF prog-id=82 op=LOAD Jan 14 01:40:59.371000 audit: BPF prog-id=43 op=UNLOAD Jan 14 01:40:59.371000 audit: BPF prog-id=44 op=UNLOAD Jan 14 01:40:59.406592 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 14 01:40:59.406712 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 14 01:40:59.406000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 01:40:59.407170 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:40:59.407240 systemd[1]: kubelet.service: Consumed 94ms CPU time, 95.2M memory peak. Jan 14 01:40:59.408812 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:40:59.521393 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:40:59.520000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:40:59.525119 (kubelet)[2528]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 14 01:40:59.558210 kubelet[2528]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 01:40:59.558210 kubelet[2528]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 14 01:40:59.558210 kubelet[2528]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 01:40:59.558533 kubelet[2528]: I0114 01:40:59.558256 2528 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 14 01:41:00.271023 kubelet[2528]: I0114 01:41:00.270977 2528 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 14 01:41:00.271023 kubelet[2528]: I0114 01:41:00.271010 2528 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 14 01:41:00.271252 kubelet[2528]: I0114 01:41:00.271237 2528 server.go:956] "Client rotation is on, will bootstrap in background" Jan 14 01:41:00.316743 kubelet[2528]: E0114 01:41:00.315807 2528 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.35.206:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.35.206:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jan 14 01:41:00.320266 kubelet[2528]: I0114 01:41:00.320222 
2528 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 14 01:41:00.330743 kubelet[2528]: I0114 01:41:00.329992 2528 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 14 01:41:00.332884 kubelet[2528]: I0114 01:41:00.332858 2528 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 14 01:41:00.337334 kubelet[2528]: I0114 01:41:00.337258 2528 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 14 01:41:00.337464 kubelet[2528]: I0114 01:41:00.337306 2528 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4578-0-0-p-96753e66ce","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 14 01:41:00.337563 kubelet[2528]: I0114 01:41:00.337538 2528 topology_manager.go:138] "Creating topology manager with none policy" Jan 14 01:41:00.337563 kubelet[2528]: I0114 01:41:00.337547 2528 container_manager_linux.go:303] "Creating device plugin manager" Jan 14 01:41:00.337828 kubelet[2528]: I0114 01:41:00.337770 2528 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:41:00.344822 kubelet[2528]: I0114 01:41:00.344748 2528 kubelet.go:480] "Attempting to sync node with API server" Jan 14 01:41:00.344822 kubelet[2528]: I0114 01:41:00.344779 2528 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 14 01:41:00.344822 kubelet[2528]: I0114 01:41:00.344804 2528 kubelet.go:386] "Adding apiserver pod source" Jan 14 01:41:00.344822 kubelet[2528]: I0114 01:41:00.344818 2528 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 14 01:41:00.346119 kubelet[2528]: I0114 01:41:00.346071 2528 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 14 01:41:00.347096 kubelet[2528]: I0114 01:41:00.347012 2528 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 14 01:41:00.347260 kubelet[2528]: W0114 01:41:00.347200 2528 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Jan 14 01:41:00.350928 kubelet[2528]: E0114 01:41:00.350824 2528 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.35.206:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4578-0-0-p-96753e66ce&limit=500&resourceVersion=0\": dial tcp 10.0.35.206:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 14 01:41:00.350928 kubelet[2528]: E0114 01:41:00.350899 2528 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.35.206:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.35.206:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 14 01:41:00.351142 kubelet[2528]: I0114 01:41:00.351112 2528 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 14 01:41:00.351188 kubelet[2528]: I0114 01:41:00.351154 2528 server.go:1289] "Started kubelet" Jan 14 01:41:00.351824 kubelet[2528]: I0114 01:41:00.351220 2528 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 14 01:41:00.353598 kubelet[2528]: I0114 01:41:00.353515 2528 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 14 01:41:00.353986 kubelet[2528]: I0114 01:41:00.353953 2528 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 14 01:41:00.357057 kubelet[2528]: I0114 01:41:00.357028 2528 server.go:317] "Adding debug handlers to kubelet server" Jan 14 01:41:00.358651 kubelet[2528]: I0114 01:41:00.358603 2528 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 14 01:41:00.360635 kubelet[2528]: I0114 01:41:00.360577 2528 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 14 01:41:00.360896 
kubelet[2528]: I0114 01:41:00.360870 2528 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 14 01:41:00.361207 kubelet[2528]: E0114 01:41:00.361044 2528 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-96753e66ce\" not found" Jan 14 01:41:00.361461 kubelet[2528]: I0114 01:41:00.360786 2528 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 14 01:41:00.361628 kubelet[2528]: I0114 01:41:00.361614 2528 reconciler.go:26] "Reconciler: start to sync state" Jan 14 01:41:00.362000 audit[2545]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2545 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:00.362000 audit[2545]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffc6266cd0 a2=0 a3=0 items=0 ppid=2528 pid=2545 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.362000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 14 01:41:00.364579 kubelet[2528]: I0114 01:41:00.364180 2528 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 14 01:41:00.365292 kubelet[2528]: E0114 01:41:00.357343 2528 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.35.206:6443/api/v1/namespaces/default/events\": dial tcp 10.0.35.206:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4578-0-0-p-96753e66ce.188a756058107698 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4578-0-0-p-96753e66ce,UID:ci-4578-0-0-p-96753e66ce,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4578-0-0-p-96753e66ce,},FirstTimestamp:2026-01-14 01:41:00.351125144 +0000 UTC m=+0.822657184,LastTimestamp:2026-01-14 01:41:00.351125144 +0000 UTC m=+0.822657184,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4578-0-0-p-96753e66ce,}" Jan 14 01:41:00.365292 kubelet[2528]: E0114 01:41:00.365064 2528 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.35.206:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4578-0-0-p-96753e66ce?timeout=10s\": dial tcp 10.0.35.206:6443: connect: connection refused" interval="200ms" Jan 14 01:41:00.365292 kubelet[2528]: E0114 01:41:00.365084 2528 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.35.206:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.35.206:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 14 01:41:00.365787 kubelet[2528]: I0114 01:41:00.365663 2528 factory.go:223] Registration of the containerd container factory successfully Jan 14 01:41:00.365787 kubelet[2528]: I0114 01:41:00.365684 2528 factory.go:223] Registration of the systemd container factory successfully Jan 14 01:41:00.364000 audit[2546]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2546 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:00.364000 audit[2546]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd364b0d0 a2=0 a3=0 items=0 ppid=2528 pid=2546 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.364000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 14 01:41:00.367000 audit[2548]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2548 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:00.367000 audit[2548]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffec1ab170 a2=0 a3=0 items=0 ppid=2528 pid=2548 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.367000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:41:00.369000 audit[2550]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2550 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:00.369000 audit[2550]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffff5318e80 a2=0 a3=0 items=0 ppid=2528 pid=2550 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.369000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:41:00.374000 audit[2553]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2553 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:00.374000 audit[2553]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=ffffc0a78bb0 a2=0 a3=0 items=0 ppid=2528 pid=2553 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.374000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 14 01:41:00.377162 kubelet[2528]: I0114 01:41:00.377113 2528 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jan 14 01:41:00.377000 audit[2555]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2555 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:00.377000 audit[2555]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=fffff728e800 a2=0 a3=0 items=0 ppid=2528 pid=2555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.377000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 14 01:41:00.378359 kubelet[2528]: I0114 01:41:00.378334 2528 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Jan 14 01:41:00.378359 kubelet[2528]: I0114 01:41:00.378358 2528 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 14 01:41:00.378585 kubelet[2528]: I0114 01:41:00.378383 2528 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jan 14 01:41:00.378585 kubelet[2528]: I0114 01:41:00.378391 2528 kubelet.go:2436] "Starting kubelet main sync loop" Jan 14 01:41:00.378585 kubelet[2528]: E0114 01:41:00.378429 2528 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 14 01:41:00.377000 audit[2556]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2556 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:00.377000 audit[2556]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc157daa0 a2=0 a3=0 items=0 ppid=2528 pid=2556 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.377000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 14 01:41:00.377000 audit[2557]: NETFILTER_CFG table=mangle:49 family=10 entries=1 op=nft_register_chain pid=2557 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:00.377000 audit[2557]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe065cfd0 a2=0 a3=0 items=0 ppid=2528 pid=2557 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.377000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 14 01:41:00.379000 audit[2558]: NETFILTER_CFG table=nat:50 family=2 entries=1 op=nft_register_chain pid=2558 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:00.379000 audit[2558]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdff267f0 a2=0 a3=0 items=0 ppid=2528 
pid=2558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.379000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 14 01:41:00.379000 audit[2559]: NETFILTER_CFG table=nat:51 family=10 entries=1 op=nft_register_chain pid=2559 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:00.379000 audit[2559]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc5fb5e80 a2=0 a3=0 items=0 ppid=2528 pid=2559 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.379000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 14 01:41:00.379000 audit[2560]: NETFILTER_CFG table=filter:52 family=2 entries=1 op=nft_register_chain pid=2560 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:00.379000 audit[2560]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe6b6ee80 a2=0 a3=0 items=0 ppid=2528 pid=2560 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.379000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 14 01:41:00.382129 kubelet[2528]: I0114 01:41:00.382107 2528 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 14 01:41:00.382129 kubelet[2528]: I0114 01:41:00.382126 2528 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 14 01:41:00.382264 kubelet[2528]: I0114 
01:41:00.382146 2528 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:41:00.382000 audit[2565]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2565 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:00.382000 audit[2565]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe61ff030 a2=0 a3=0 items=0 ppid=2528 pid=2565 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.382000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 14 01:41:00.384224 kubelet[2528]: E0114 01:41:00.384195 2528 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.35.206:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.35.206:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 14 01:41:00.385580 kubelet[2528]: I0114 01:41:00.385556 2528 policy_none.go:49] "None policy: Start" Jan 14 01:41:00.385629 kubelet[2528]: I0114 01:41:00.385584 2528 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 14 01:41:00.385629 kubelet[2528]: I0114 01:41:00.385597 2528 state_mem.go:35] "Initializing new in-memory state store" Jan 14 01:41:00.391014 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 14 01:41:00.412893 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 14 01:41:00.415941 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jan 14 01:41:00.429941 kubelet[2528]: E0114 01:41:00.429916 2528 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 14 01:41:00.430236 kubelet[2528]: I0114 01:41:00.430120 2528 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 14 01:41:00.430236 kubelet[2528]: I0114 01:41:00.430138 2528 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 14 01:41:00.430380 kubelet[2528]: I0114 01:41:00.430337 2528 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 14 01:41:00.431677 kubelet[2528]: E0114 01:41:00.431650 2528 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 14 01:41:00.432019 kubelet[2528]: E0114 01:41:00.431698 2528 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4578-0-0-p-96753e66ce\" not found" Jan 14 01:41:00.490225 systemd[1]: Created slice kubepods-burstable-pod7f1234cf1ba046a8054498393ed73983.slice - libcontainer container kubepods-burstable-pod7f1234cf1ba046a8054498393ed73983.slice. Jan 14 01:41:00.503181 kubelet[2528]: E0114 01:41:00.503148 2528 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-96753e66ce\" not found" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.505777 systemd[1]: Created slice kubepods-burstable-pod21692cf7dfa799cca6b877e5612fce31.slice - libcontainer container kubepods-burstable-pod21692cf7dfa799cca6b877e5612fce31.slice. 
Jan 14 01:41:00.509466 kubelet[2528]: E0114 01:41:00.509358 2528 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-96753e66ce\" not found" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.510468 systemd[1]: Created slice kubepods-burstable-poddd26776522857ef53a2210b52507be51.slice - libcontainer container kubepods-burstable-poddd26776522857ef53a2210b52507be51.slice. Jan 14 01:41:00.512060 kubelet[2528]: E0114 01:41:00.512036 2528 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-96753e66ce\" not found" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.532555 kubelet[2528]: I0114 01:41:00.532468 2528 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.533020 kubelet[2528]: E0114 01:41:00.532989 2528 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.35.206:6443/api/v1/nodes\": dial tcp 10.0.35.206:6443: connect: connection refused" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.563013 kubelet[2528]: I0114 01:41:00.562979 2528 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/21692cf7dfa799cca6b877e5612fce31-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4578-0-0-p-96753e66ce\" (UID: \"21692cf7dfa799cca6b877e5612fce31\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.563013 kubelet[2528]: I0114 01:41:00.563017 2528 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7f1234cf1ba046a8054498393ed73983-ca-certs\") pod \"kube-apiserver-ci-4578-0-0-p-96753e66ce\" (UID: \"7f1234cf1ba046a8054498393ed73983\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-96753e66ce" Jan 14 
01:41:00.563332 kubelet[2528]: I0114 01:41:00.563035 2528 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/21692cf7dfa799cca6b877e5612fce31-ca-certs\") pod \"kube-controller-manager-ci-4578-0-0-p-96753e66ce\" (UID: \"21692cf7dfa799cca6b877e5612fce31\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.563332 kubelet[2528]: I0114 01:41:00.563054 2528 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/21692cf7dfa799cca6b877e5612fce31-k8s-certs\") pod \"kube-controller-manager-ci-4578-0-0-p-96753e66ce\" (UID: \"21692cf7dfa799cca6b877e5612fce31\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.563332 kubelet[2528]: I0114 01:41:00.563070 2528 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/dd26776522857ef53a2210b52507be51-kubeconfig\") pod \"kube-scheduler-ci-4578-0-0-p-96753e66ce\" (UID: \"dd26776522857ef53a2210b52507be51\") " pod="kube-system/kube-scheduler-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.563332 kubelet[2528]: I0114 01:41:00.563085 2528 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7f1234cf1ba046a8054498393ed73983-k8s-certs\") pod \"kube-apiserver-ci-4578-0-0-p-96753e66ce\" (UID: \"7f1234cf1ba046a8054498393ed73983\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.563332 kubelet[2528]: I0114 01:41:00.563115 2528 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7f1234cf1ba046a8054498393ed73983-usr-share-ca-certificates\") pod 
\"kube-apiserver-ci-4578-0-0-p-96753e66ce\" (UID: \"7f1234cf1ba046a8054498393ed73983\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.563441 kubelet[2528]: I0114 01:41:00.563151 2528 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/21692cf7dfa799cca6b877e5612fce31-flexvolume-dir\") pod \"kube-controller-manager-ci-4578-0-0-p-96753e66ce\" (UID: \"21692cf7dfa799cca6b877e5612fce31\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.563441 kubelet[2528]: I0114 01:41:00.563170 2528 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/21692cf7dfa799cca6b877e5612fce31-kubeconfig\") pod \"kube-controller-manager-ci-4578-0-0-p-96753e66ce\" (UID: \"21692cf7dfa799cca6b877e5612fce31\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.566516 kubelet[2528]: E0114 01:41:00.566486 2528 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.35.206:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4578-0-0-p-96753e66ce?timeout=10s\": dial tcp 10.0.35.206:6443: connect: connection refused" interval="400ms" Jan 14 01:41:00.736090 kubelet[2528]: I0114 01:41:00.735839 2528 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.736195 kubelet[2528]: E0114 01:41:00.736159 2528 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.35.206:6443/api/v1/nodes\": dial tcp 10.0.35.206:6443: connect: connection refused" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:00.804943 containerd[1676]: time="2026-01-14T01:41:00.804845882Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4578-0-0-p-96753e66ce,Uid:7f1234cf1ba046a8054498393ed73983,Namespace:kube-system,Attempt:0,}" Jan 14 01:41:00.810598 containerd[1676]: time="2026-01-14T01:41:00.810566097Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4578-0-0-p-96753e66ce,Uid:21692cf7dfa799cca6b877e5612fce31,Namespace:kube-system,Attempt:0,}" Jan 14 01:41:00.813325 containerd[1676]: time="2026-01-14T01:41:00.813235263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4578-0-0-p-96753e66ce,Uid:dd26776522857ef53a2210b52507be51,Namespace:kube-system,Attempt:0,}" Jan 14 01:41:00.836920 containerd[1676]: time="2026-01-14T01:41:00.836873643Z" level=info msg="connecting to shim 6496a834bd0f0c72870da984b8bee4798c8e09ac6181cd10620d39a9a398319f" address="unix:///run/containerd/s/7e5eed565817805cda3aa82b9005cc88ccdbabd99d4b10e16e9cb32d8a0b0891" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:00.850954 containerd[1676]: time="2026-01-14T01:41:00.850774318Z" level=info msg="connecting to shim 9ec47dfaa26e672adb481f36a5b1308fff991f9cbc0940c99e5398bb264d5293" address="unix:///run/containerd/s/9823c418787b7867489d8c9262c14b4a6853a297804addbb33cf3fda1c1f7584" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:00.851612 containerd[1676]: time="2026-01-14T01:41:00.851589800Z" level=info msg="connecting to shim 169ccaf4eb92b1321042234eed861ab0b1a7c692193a1e178e62955842d3a3cd" address="unix:///run/containerd/s/a3fd9a60785d65bc1160d748ff9292b3e776e5ff29585cce16ccef6e3eb2f1af" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:00.871940 systemd[1]: Started cri-containerd-6496a834bd0f0c72870da984b8bee4798c8e09ac6181cd10620d39a9a398319f.scope - libcontainer container 6496a834bd0f0c72870da984b8bee4798c8e09ac6181cd10620d39a9a398319f. 
Jan 14 01:41:00.876492 systemd[1]: Started cri-containerd-169ccaf4eb92b1321042234eed861ab0b1a7c692193a1e178e62955842d3a3cd.scope - libcontainer container 169ccaf4eb92b1321042234eed861ab0b1a7c692193a1e178e62955842d3a3cd. Jan 14 01:41:00.878160 systemd[1]: Started cri-containerd-9ec47dfaa26e672adb481f36a5b1308fff991f9cbc0940c99e5398bb264d5293.scope - libcontainer container 9ec47dfaa26e672adb481f36a5b1308fff991f9cbc0940c99e5398bb264d5293. Jan 14 01:41:00.887651 kernel: kauditd_printk_skb: 72 callbacks suppressed Jan 14 01:41:00.887743 kernel: audit: type=1334 audit(1768354860.883:348): prog-id=83 op=LOAD Jan 14 01:41:00.883000 audit: BPF prog-id=83 op=LOAD Jan 14 01:41:00.887000 audit: BPF prog-id=84 op=LOAD Jan 14 01:41:00.889672 kernel: audit: type=1334 audit(1768354860.887:349): prog-id=84 op=LOAD Jan 14 01:41:00.889751 kernel: audit: type=1300 audit(1768354860.887:349): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2575 pid=2611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.887000 audit[2611]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2575 pid=2611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.893775 kernel: audit: type=1327 audit(1768354860.887:349): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393661383334626430663063373238373064613938346238626565 Jan 14 01:41:00.887000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393661383334626430663063373238373064613938346238626565 Jan 14 01:41:00.887000 audit: BPF prog-id=84 op=UNLOAD Jan 14 01:41:00.898633 kernel: audit: type=1334 audit(1768354860.887:350): prog-id=84 op=UNLOAD Jan 14 01:41:00.887000 audit[2611]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2575 pid=2611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.902884 kernel: audit: type=1300 audit(1768354860.887:350): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2575 pid=2611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.902984 kernel: audit: type=1327 audit(1768354860.887:350): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393661383334626430663063373238373064613938346238626565 Jan 14 01:41:00.887000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393661383334626430663063373238373064613938346238626565 Jan 14 01:41:00.887000 audit: BPF prog-id=85 op=LOAD Jan 14 01:41:00.907901 kernel: audit: type=1334 audit(1768354860.887:351): prog-id=85 op=LOAD Jan 14 01:41:00.887000 audit: BPF prog-id=86 op=LOAD Jan 14 01:41:00.908926 kernel: audit: audit_backlog=65 > 
audit_backlog_limit=64 Jan 14 01:41:00.908967 kernel: audit: type=1334 audit(1768354860.887:352): prog-id=86 op=LOAD Jan 14 01:41:00.887000 audit[2611]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2575 pid=2611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.887000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393661383334626430663063373238373064613938346238626565 Jan 14 01:41:00.895000 audit: BPF prog-id=87 op=LOAD Jan 14 01:41:00.892000 audit: BPF prog-id=88 op=LOAD Jan 14 01:41:00.892000 audit[2611]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2575 pid=2611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.892000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393661383334626430663063373238373064613938346238626565 Jan 14 01:41:00.895000 audit: BPF prog-id=88 op=UNLOAD Jan 14 01:41:00.895000 audit[2611]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2575 pid=2611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.895000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393661383334626430663063373238373064613938346238626565 Jan 14 01:41:00.895000 audit: BPF prog-id=86 op=UNLOAD Jan 14 01:41:00.895000 audit[2611]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2575 pid=2611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.895000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393661383334626430663063373238373064613938346238626565 Jan 14 01:41:00.892000 audit: BPF prog-id=89 op=LOAD Jan 14 01:41:00.896000 audit: BPF prog-id=90 op=LOAD Jan 14 01:41:00.896000 audit[2611]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2575 pid=2611 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.896000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634393661383334626430663063373238373064613938346238626565 Jan 14 01:41:00.892000 audit[2636]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2604 pid=2636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 14 01:41:00.892000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136396363616634656239326231333231303432323334656564383631 Jan 14 01:41:00.896000 audit: BPF prog-id=89 op=UNLOAD Jan 14 01:41:00.896000 audit[2636]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2604 pid=2636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.896000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136396363616634656239326231333231303432323334656564383631 Jan 14 01:41:00.897000 audit: BPF prog-id=91 op=LOAD Jan 14 01:41:00.897000 audit[2636]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2604 pid=2636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136396363616634656239326231333231303432323334656564383631 Jan 14 01:41:00.897000 audit: BPF prog-id=92 op=LOAD Jan 14 01:41:00.897000 audit[2628]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2599 pid=2628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965633437646661613236653637326164623438316633366135623133 Jan 14 01:41:00.897000 audit: BPF prog-id=92 op=UNLOAD Jan 14 01:41:00.897000 audit: BPF prog-id=93 op=LOAD Jan 14 01:41:00.897000 audit[2628]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.897000 audit[2636]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2604 pid=2636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965633437646661613236653637326164623438316633366135623133 Jan 14 01:41:00.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136396363616634656239326231333231303432323334656564383631 Jan 14 01:41:00.897000 audit: BPF prog-id=93 op=UNLOAD Jan 14 01:41:00.897000 audit[2636]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2604 pid=2636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136396363616634656239326231333231303432323334656564383631 Jan 14 01:41:00.897000 audit: BPF prog-id=91 op=UNLOAD Jan 14 01:41:00.897000 audit[2636]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2604 pid=2636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136396363616634656239326231333231303432323334656564383631 Jan 14 01:41:00.897000 audit: BPF prog-id=94 op=LOAD Jan 14 01:41:00.897000 audit[2636]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2604 pid=2636 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.897000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3136396363616634656239326231333231303432323334656564383631 Jan 14 01:41:00.901000 audit: BPF prog-id=95 op=LOAD Jan 14 01:41:00.901000 audit[2628]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2599 pid=2628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.901000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965633437646661613236653637326164623438316633366135623133 Jan 14 01:41:00.905000 audit: BPF prog-id=96 op=LOAD Jan 14 01:41:00.905000 audit[2628]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2599 pid=2628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.905000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965633437646661613236653637326164623438316633366135623133 Jan 14 01:41:00.907000 audit: BPF prog-id=95 op=UNLOAD Jan 14 01:41:00.907000 audit[2628]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.907000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965633437646661613236653637326164623438316633366135623133 Jan 14 01:41:00.911000 audit: BPF prog-id=97 op=LOAD Jan 14 01:41:00.911000 audit[2628]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2599 pid=2628 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:00.911000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3965633437646661613236653637326164623438316633366135623133 Jan 14 01:41:00.936745 containerd[1676]: time="2026-01-14T01:41:00.936675413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4578-0-0-p-96753e66ce,Uid:dd26776522857ef53a2210b52507be51,Namespace:kube-system,Attempt:0,} returns sandbox id \"169ccaf4eb92b1321042234eed861ab0b1a7c692193a1e178e62955842d3a3cd\"" Jan 14 01:41:00.945078 containerd[1676]: time="2026-01-14T01:41:00.945041074Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4578-0-0-p-96753e66ce,Uid:21692cf7dfa799cca6b877e5612fce31,Namespace:kube-system,Attempt:0,} returns sandbox id \"9ec47dfaa26e672adb481f36a5b1308fff991f9cbc0940c99e5398bb264d5293\"" Jan 14 01:41:00.948976 containerd[1676]: time="2026-01-14T01:41:00.948878644Z" level=info msg="CreateContainer within sandbox \"169ccaf4eb92b1321042234eed861ab0b1a7c692193a1e178e62955842d3a3cd\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 14 01:41:00.949492 containerd[1676]: time="2026-01-14T01:41:00.949457045Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4578-0-0-p-96753e66ce,Uid:7f1234cf1ba046a8054498393ed73983,Namespace:kube-system,Attempt:0,} returns sandbox id \"6496a834bd0f0c72870da984b8bee4798c8e09ac6181cd10620d39a9a398319f\"" Jan 14 01:41:00.951597 containerd[1676]: time="2026-01-14T01:41:00.951024729Z" level=info msg="CreateContainer within sandbox \"9ec47dfaa26e672adb481f36a5b1308fff991f9cbc0940c99e5398bb264d5293\" for container 
&ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 14 01:41:00.954498 containerd[1676]: time="2026-01-14T01:41:00.954467698Z" level=info msg="CreateContainer within sandbox \"6496a834bd0f0c72870da984b8bee4798c8e09ac6181cd10620d39a9a398319f\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 14 01:41:00.958658 containerd[1676]: time="2026-01-14T01:41:00.958623068Z" level=info msg="Container 4ef4cffedf687844368f629c4f4c88098c02e6880c283f9a4d77dad7ebaa3749: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:41:00.967185 kubelet[2528]: E0114 01:41:00.967121 2528 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.35.206:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4578-0-0-p-96753e66ce?timeout=10s\": dial tcp 10.0.35.206:6443: connect: connection refused" interval="800ms" Jan 14 01:41:00.967370 containerd[1676]: time="2026-01-14T01:41:00.967333690Z" level=info msg="Container 435a536ddb1c0891d0028b96ab5116258cd5b16a5fd47ce952a83eea92b1f4ab: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:41:00.968592 containerd[1676]: time="2026-01-14T01:41:00.968549413Z" level=info msg="CreateContainer within sandbox \"169ccaf4eb92b1321042234eed861ab0b1a7c692193a1e178e62955842d3a3cd\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"4ef4cffedf687844368f629c4f4c88098c02e6880c283f9a4d77dad7ebaa3749\"" Jan 14 01:41:00.969241 containerd[1676]: time="2026-01-14T01:41:00.969207855Z" level=info msg="StartContainer for \"4ef4cffedf687844368f629c4f4c88098c02e6880c283f9a4d77dad7ebaa3749\"" Jan 14 01:41:00.970409 containerd[1676]: time="2026-01-14T01:41:00.970164857Z" level=info msg="Container 55bcad733ab2ac5f313280cc614dc313a2ded09ac66d373503b1e89f0075fb4f: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:41:00.970477 containerd[1676]: time="2026-01-14T01:41:00.970422498Z" level=info msg="connecting to shim 
4ef4cffedf687844368f629c4f4c88098c02e6880c283f9a4d77dad7ebaa3749" address="unix:///run/containerd/s/a3fd9a60785d65bc1160d748ff9292b3e776e5ff29585cce16ccef6e3eb2f1af" protocol=ttrpc version=3 Jan 14 01:41:00.978170 containerd[1676]: time="2026-01-14T01:41:00.978129437Z" level=info msg="CreateContainer within sandbox \"9ec47dfaa26e672adb481f36a5b1308fff991f9cbc0940c99e5398bb264d5293\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"435a536ddb1c0891d0028b96ab5116258cd5b16a5fd47ce952a83eea92b1f4ab\"" Jan 14 01:41:00.979506 containerd[1676]: time="2026-01-14T01:41:00.979474600Z" level=info msg="StartContainer for \"435a536ddb1c0891d0028b96ab5116258cd5b16a5fd47ce952a83eea92b1f4ab\"" Jan 14 01:41:00.980585 containerd[1676]: time="2026-01-14T01:41:00.980557723Z" level=info msg="connecting to shim 435a536ddb1c0891d0028b96ab5116258cd5b16a5fd47ce952a83eea92b1f4ab" address="unix:///run/containerd/s/9823c418787b7867489d8c9262c14b4a6853a297804addbb33cf3fda1c1f7584" protocol=ttrpc version=3 Jan 14 01:41:00.984606 containerd[1676]: time="2026-01-14T01:41:00.984555213Z" level=info msg="CreateContainer within sandbox \"6496a834bd0f0c72870da984b8bee4798c8e09ac6181cd10620d39a9a398319f\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"55bcad733ab2ac5f313280cc614dc313a2ded09ac66d373503b1e89f0075fb4f\"" Jan 14 01:41:00.985008 containerd[1676]: time="2026-01-14T01:41:00.984986614Z" level=info msg="StartContainer for \"55bcad733ab2ac5f313280cc614dc313a2ded09ac66d373503b1e89f0075fb4f\"" Jan 14 01:41:00.986118 containerd[1676]: time="2026-01-14T01:41:00.986089937Z" level=info msg="connecting to shim 55bcad733ab2ac5f313280cc614dc313a2ded09ac66d373503b1e89f0075fb4f" address="unix:///run/containerd/s/7e5eed565817805cda3aa82b9005cc88ccdbabd99d4b10e16e9cb32d8a0b0891" protocol=ttrpc version=3 Jan 14 01:41:00.988971 systemd[1]: Started cri-containerd-4ef4cffedf687844368f629c4f4c88098c02e6880c283f9a4d77dad7ebaa3749.scope - 
libcontainer container 4ef4cffedf687844368f629c4f4c88098c02e6880c283f9a4d77dad7ebaa3749. Jan 14 01:41:01.001938 systemd[1]: Started cri-containerd-435a536ddb1c0891d0028b96ab5116258cd5b16a5fd47ce952a83eea92b1f4ab.scope - libcontainer container 435a536ddb1c0891d0028b96ab5116258cd5b16a5fd47ce952a83eea92b1f4ab. Jan 14 01:41:01.005586 systemd[1]: Started cri-containerd-55bcad733ab2ac5f313280cc614dc313a2ded09ac66d373503b1e89f0075fb4f.scope - libcontainer container 55bcad733ab2ac5f313280cc614dc313a2ded09ac66d373503b1e89f0075fb4f. Jan 14 01:41:01.005000 audit: BPF prog-id=98 op=LOAD Jan 14 01:41:01.006000 audit: BPF prog-id=99 op=LOAD Jan 14 01:41:01.006000 audit[2707]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=2604 pid=2707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465663463666665646636383738343433363866363239633466346338 Jan 14 01:41:01.006000 audit: BPF prog-id=99 op=UNLOAD Jan 14 01:41:01.006000 audit[2707]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2604 pid=2707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465663463666665646636383738343433363866363239633466346338 Jan 14 01:41:01.006000 audit: BPF prog-id=100 op=LOAD Jan 14 
01:41:01.006000 audit[2707]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=2604 pid=2707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465663463666665646636383738343433363866363239633466346338 Jan 14 01:41:01.006000 audit: BPF prog-id=101 op=LOAD Jan 14 01:41:01.006000 audit[2707]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=2604 pid=2707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465663463666665646636383738343433363866363239633466346338 Jan 14 01:41:01.006000 audit: BPF prog-id=101 op=UNLOAD Jan 14 01:41:01.006000 audit[2707]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2604 pid=2707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465663463666665646636383738343433363866363239633466346338 Jan 14 
01:41:01.006000 audit: BPF prog-id=100 op=UNLOAD Jan 14 01:41:01.006000 audit[2707]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2604 pid=2707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465663463666665646636383738343433363866363239633466346338 Jan 14 01:41:01.006000 audit: BPF prog-id=102 op=LOAD Jan 14 01:41:01.006000 audit[2707]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=2604 pid=2707 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.006000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3465663463666665646636383738343433363866363239633466346338 Jan 14 01:41:01.017000 audit: BPF prog-id=103 op=LOAD Jan 14 01:41:01.017000 audit: BPF prog-id=104 op=LOAD Jan 14 01:41:01.017000 audit[2720]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2599 pid=2720 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.017000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3433356135333664646231633038393164303032386239366162353131 Jan 14 01:41:01.018000 audit: BPF prog-id=104 op=UNLOAD Jan 14 01:41:01.018000 audit[2720]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2720 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3433356135333664646231633038393164303032386239366162353131 Jan 14 01:41:01.018000 audit: BPF prog-id=105 op=LOAD Jan 14 01:41:01.018000 audit[2720]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2599 pid=2720 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3433356135333664646231633038393164303032386239366162353131 Jan 14 01:41:01.018000 audit: BPF prog-id=106 op=LOAD Jan 14 01:41:01.018000 audit[2720]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2599 pid=2720 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:41:01.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3433356135333664646231633038393164303032386239366162353131 Jan 14 01:41:01.018000 audit: BPF prog-id=106 op=UNLOAD Jan 14 01:41:01.018000 audit[2720]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2720 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3433356135333664646231633038393164303032386239366162353131 Jan 14 01:41:01.018000 audit: BPF prog-id=105 op=UNLOAD Jan 14 01:41:01.018000 audit[2720]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=2720 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3433356135333664646231633038393164303032386239366162353131 Jan 14 01:41:01.018000 audit: BPF prog-id=107 op=LOAD Jan 14 01:41:01.018000 audit[2720]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2599 pid=2720 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.018000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3433356135333664646231633038393164303032386239366162353131 Jan 14 01:41:01.020000 audit: BPF prog-id=108 op=LOAD Jan 14 01:41:01.020000 audit: BPF prog-id=109 op=LOAD Jan 14 01:41:01.020000 audit[2726]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2575 pid=2726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.020000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535626361643733336162326163356633313332383063633631346463 Jan 14 01:41:01.020000 audit: BPF prog-id=109 op=UNLOAD Jan 14 01:41:01.020000 audit[2726]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2575 pid=2726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.020000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535626361643733336162326163356633313332383063633631346463 Jan 14 01:41:01.021000 audit: BPF prog-id=110 op=LOAD Jan 14 01:41:01.021000 audit[2726]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2575 pid=2726 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.021000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535626361643733336162326163356633313332383063633631346463 Jan 14 01:41:01.021000 audit: BPF prog-id=111 op=LOAD Jan 14 01:41:01.021000 audit[2726]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2575 pid=2726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.021000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535626361643733336162326163356633313332383063633631346463 Jan 14 01:41:01.021000 audit: BPF prog-id=111 op=UNLOAD Jan 14 01:41:01.021000 audit[2726]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2575 pid=2726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.021000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535626361643733336162326163356633313332383063633631346463 Jan 14 01:41:01.021000 audit: BPF prog-id=110 op=UNLOAD Jan 14 01:41:01.021000 audit[2726]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2575 
pid=2726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.021000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535626361643733336162326163356633313332383063633631346463 Jan 14 01:41:01.021000 audit: BPF prog-id=112 op=LOAD Jan 14 01:41:01.021000 audit[2726]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2575 pid=2726 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:01.021000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3535626361643733336162326163356633313332383063633631346463 Jan 14 01:41:01.049575 containerd[1676]: time="2026-01-14T01:41:01.049531176Z" level=info msg="StartContainer for \"4ef4cffedf687844368f629c4f4c88098c02e6880c283f9a4d77dad7ebaa3749\" returns successfully" Jan 14 01:41:01.057962 containerd[1676]: time="2026-01-14T01:41:01.057854077Z" level=info msg="StartContainer for \"55bcad733ab2ac5f313280cc614dc313a2ded09ac66d373503b1e89f0075fb4f\" returns successfully" Jan 14 01:41:01.062039 containerd[1676]: time="2026-01-14T01:41:01.062002327Z" level=info msg="StartContainer for \"435a536ddb1c0891d0028b96ab5116258cd5b16a5fd47ce952a83eea92b1f4ab\" returns successfully" Jan 14 01:41:01.138647 kubelet[2528]: I0114 01:41:01.138378 2528 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:01.138762 kubelet[2528]: E0114 
01:41:01.138735 2528 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.35.206:6443/api/v1/nodes\": dial tcp 10.0.35.206:6443: connect: connection refused" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:01.386977 kubelet[2528]: E0114 01:41:01.386881 2528 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-96753e66ce\" not found" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:01.392569 kubelet[2528]: E0114 01:41:01.392540 2528 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-96753e66ce\" not found" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:01.393797 kubelet[2528]: E0114 01:41:01.393779 2528 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-96753e66ce\" not found" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:01.940875 kubelet[2528]: I0114 01:41:01.940842 2528 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:02.400741 kubelet[2528]: E0114 01:41:02.400285 2528 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-96753e66ce\" not found" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:02.400741 kubelet[2528]: E0114 01:41:02.400373 2528 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4578-0-0-p-96753e66ce\" not found" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:02.449182 kubelet[2528]: E0114 01:41:02.449121 2528 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4578-0-0-p-96753e66ce\" not found" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:02.606740 kubelet[2528]: I0114 01:41:02.606548 2528 kubelet_node_status.go:78] "Successfully registered node" node="ci-4578-0-0-p-96753e66ce" 
Jan 14 01:41:02.662360 kubelet[2528]: I0114 01:41:02.661391 2528 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:02.667222 kubelet[2528]: E0114 01:41:02.667177 2528 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4578-0-0-p-96753e66ce\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:02.667222 kubelet[2528]: I0114 01:41:02.667210 2528 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:02.669166 kubelet[2528]: E0114 01:41:02.668962 2528 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4578-0-0-p-96753e66ce\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:02.669166 kubelet[2528]: I0114 01:41:02.668991 2528 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:02.670497 kubelet[2528]: E0114 01:41:02.670464 2528 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4578-0-0-p-96753e66ce\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:03.347045 kubelet[2528]: I0114 01:41:03.346828 2528 apiserver.go:52] "Watching apiserver" Jan 14 01:41:03.361056 kubelet[2528]: I0114 01:41:03.361005 2528 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 14 01:41:04.641260 systemd[1]: Reload requested from client PID 2814 ('systemctl') (unit session-10.scope)... Jan 14 01:41:04.641278 systemd[1]: Reloading... Jan 14 01:41:04.719780 zram_generator::config[2860]: No configuration found. 
Jan 14 01:41:04.905551 systemd[1]: Reloading finished in 264 ms. Jan 14 01:41:04.933915 kubelet[2528]: I0114 01:41:04.933827 2528 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 14 01:41:04.933992 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 01:41:04.943925 systemd[1]: kubelet.service: Deactivated successfully. Jan 14 01:41:04.944206 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 01:41:04.942000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:41:04.944272 systemd[1]: kubelet.service: Consumed 1.168s CPU time, 128.7M memory peak. Jan 14 01:41:04.945000 audit: BPF prog-id=113 op=LOAD Jan 14 01:41:04.945000 audit: BPF prog-id=75 op=UNLOAD Jan 14 01:41:04.945000 audit: BPF prog-id=114 op=LOAD Jan 14 01:41:04.945000 audit: BPF prog-id=115 op=LOAD Jan 14 01:41:04.945000 audit: BPF prog-id=76 op=UNLOAD Jan 14 01:41:04.945000 audit: BPF prog-id=77 op=UNLOAD Jan 14 01:41:04.946975 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 14 01:41:04.946000 audit: BPF prog-id=116 op=LOAD Jan 14 01:41:04.946000 audit: BPF prog-id=69 op=UNLOAD Jan 14 01:41:04.947000 audit: BPF prog-id=117 op=LOAD Jan 14 01:41:04.947000 audit: BPF prog-id=78 op=UNLOAD Jan 14 01:41:04.947000 audit: BPF prog-id=118 op=LOAD Jan 14 01:41:04.947000 audit: BPF prog-id=119 op=LOAD Jan 14 01:41:04.947000 audit: BPF prog-id=79 op=UNLOAD Jan 14 01:41:04.947000 audit: BPF prog-id=80 op=UNLOAD Jan 14 01:41:04.948000 audit: BPF prog-id=120 op=LOAD Jan 14 01:41:04.948000 audit: BPF prog-id=70 op=UNLOAD Jan 14 01:41:04.949000 audit: BPF prog-id=121 op=LOAD Jan 14 01:41:04.949000 audit: BPF prog-id=63 op=UNLOAD Jan 14 01:41:04.949000 audit: BPF prog-id=122 op=LOAD Jan 14 01:41:04.949000 audit: BPF prog-id=123 op=LOAD Jan 14 01:41:04.969000 audit: BPF prog-id=64 op=UNLOAD Jan 14 01:41:04.969000 audit: BPF prog-id=65 op=UNLOAD Jan 14 01:41:04.969000 audit: BPF prog-id=124 op=LOAD Jan 14 01:41:04.969000 audit: BPF prog-id=74 op=UNLOAD Jan 14 01:41:04.970000 audit: BPF prog-id=125 op=LOAD Jan 14 01:41:04.970000 audit: BPF prog-id=126 op=LOAD Jan 14 01:41:04.970000 audit: BPF prog-id=81 op=UNLOAD Jan 14 01:41:04.970000 audit: BPF prog-id=82 op=UNLOAD Jan 14 01:41:04.971000 audit: BPF prog-id=127 op=LOAD Jan 14 01:41:04.971000 audit: BPF prog-id=66 op=UNLOAD Jan 14 01:41:04.971000 audit: BPF prog-id=128 op=LOAD Jan 14 01:41:04.971000 audit: BPF prog-id=129 op=LOAD Jan 14 01:41:04.971000 audit: BPF prog-id=67 op=UNLOAD Jan 14 01:41:04.971000 audit: BPF prog-id=68 op=UNLOAD Jan 14 01:41:04.972000 audit: BPF prog-id=130 op=LOAD Jan 14 01:41:04.972000 audit: BPF prog-id=71 op=UNLOAD Jan 14 01:41:04.972000 audit: BPF prog-id=131 op=LOAD Jan 14 01:41:04.972000 audit: BPF prog-id=132 op=LOAD Jan 14 01:41:04.972000 audit: BPF prog-id=72 op=UNLOAD Jan 14 01:41:04.972000 audit: BPF prog-id=73 op=UNLOAD Jan 14 01:41:05.115888 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 14 01:41:05.114000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:41:05.122358 (kubelet)[2905]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 14 01:41:05.162485 kubelet[2905]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 01:41:05.162485 kubelet[2905]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 14 01:41:05.162485 kubelet[2905]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 14 01:41:05.162485 kubelet[2905]: I0114 01:41:05.162117 2905 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 14 01:41:05.168808 kubelet[2905]: I0114 01:41:05.168775 2905 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 14 01:41:05.168808 kubelet[2905]: I0114 01:41:05.168804 2905 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 14 01:41:05.169068 kubelet[2905]: I0114 01:41:05.169048 2905 server.go:956] "Client rotation is on, will bootstrap in background" Jan 14 01:41:05.170281 kubelet[2905]: I0114 01:41:05.170256 2905 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 14 01:41:05.173120 kubelet[2905]: I0114 01:41:05.173089 2905 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 14 01:41:05.177312 kubelet[2905]: I0114 01:41:05.177287 2905 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 14 01:41:05.180830 kubelet[2905]: I0114 01:41:05.180789 2905 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 14 01:41:05.181299 kubelet[2905]: I0114 01:41:05.181189 2905 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 14 01:41:05.181541 kubelet[2905]: I0114 01:41:05.181262 2905 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4578-0-0-p-96753e66ce","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 14 01:41:05.181623 kubelet[2905]: I0114 01:41:05.181549 2905 topology_manager.go:138] "Creating topology manager with none policy" Jan 14 
01:41:05.181623 kubelet[2905]: I0114 01:41:05.181557 2905 container_manager_linux.go:303] "Creating device plugin manager" Jan 14 01:41:05.181623 kubelet[2905]: I0114 01:41:05.181598 2905 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:41:05.181794 kubelet[2905]: I0114 01:41:05.181781 2905 kubelet.go:480] "Attempting to sync node with API server" Jan 14 01:41:05.181829 kubelet[2905]: I0114 01:41:05.181798 2905 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 14 01:41:05.181829 kubelet[2905]: I0114 01:41:05.181823 2905 kubelet.go:386] "Adding apiserver pod source" Jan 14 01:41:05.181879 kubelet[2905]: I0114 01:41:05.181835 2905 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 14 01:41:05.182971 kubelet[2905]: I0114 01:41:05.182950 2905 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 14 01:41:05.187998 kubelet[2905]: I0114 01:41:05.185831 2905 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 14 01:41:05.190052 kubelet[2905]: I0114 01:41:05.190024 2905 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 14 01:41:05.190179 kubelet[2905]: I0114 01:41:05.190168 2905 server.go:1289] "Started kubelet" Jan 14 01:41:05.190399 kubelet[2905]: I0114 01:41:05.190330 2905 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 14 01:41:05.193773 kubelet[2905]: I0114 01:41:05.192942 2905 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 14 01:41:05.193773 kubelet[2905]: I0114 01:41:05.193010 2905 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 14 01:41:05.195778 kubelet[2905]: I0114 01:41:05.194085 2905 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 14 
01:41:05.199940 kubelet[2905]: I0114 01:41:05.199891 2905 server.go:317] "Adding debug handlers to kubelet server" Jan 14 01:41:05.206060 kubelet[2905]: I0114 01:41:05.203967 2905 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 14 01:41:05.212700 kubelet[2905]: I0114 01:41:05.212603 2905 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 14 01:41:05.214852 kubelet[2905]: I0114 01:41:05.212844 2905 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 14 01:41:05.214852 kubelet[2905]: E0114 01:41:05.214112 2905 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4578-0-0-p-96753e66ce\" not found" Jan 14 01:41:05.215767 kubelet[2905]: I0114 01:41:05.215002 2905 reconciler.go:26] "Reconciler: start to sync state" Jan 14 01:41:05.222727 kubelet[2905]: E0114 01:41:05.221862 2905 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 14 01:41:05.225089 kubelet[2905]: I0114 01:41:05.225034 2905 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 14 01:41:05.226536 kubelet[2905]: I0114 01:41:05.226478 2905 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jan 14 01:41:05.227341 kubelet[2905]: I0114 01:41:05.227315 2905 factory.go:223] Registration of the containerd container factory successfully Jan 14 01:41:05.227341 kubelet[2905]: I0114 01:41:05.227335 2905 factory.go:223] Registration of the systemd container factory successfully Jan 14 01:41:05.228138 kubelet[2905]: I0114 01:41:05.228109 2905 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Jan 14 01:41:05.228138 kubelet[2905]: I0114 01:41:05.228131 2905 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 14 01:41:05.228231 kubelet[2905]: I0114 01:41:05.228148 2905 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 14 01:41:05.228231 kubelet[2905]: I0114 01:41:05.228155 2905 kubelet.go:2436] "Starting kubelet main sync loop" Jan 14 01:41:05.228231 kubelet[2905]: E0114 01:41:05.228190 2905 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 14 01:41:05.262686 kubelet[2905]: I0114 01:41:05.262370 2905 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 14 01:41:05.262686 kubelet[2905]: I0114 01:41:05.262390 2905 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 14 01:41:05.262686 kubelet[2905]: I0114 01:41:05.262412 2905 state_mem.go:36] "Initialized new in-memory state store" Jan 14 01:41:05.262686 kubelet[2905]: I0114 01:41:05.262558 2905 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 14 01:41:05.262686 kubelet[2905]: I0114 01:41:05.262568 2905 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 14 01:41:05.262686 kubelet[2905]: I0114 01:41:05.262583 2905 policy_none.go:49] "None policy: Start" Jan 14 01:41:05.262686 kubelet[2905]: I0114 01:41:05.262592 2905 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 14 01:41:05.262686 kubelet[2905]: I0114 01:41:05.262600 2905 state_mem.go:35] "Initializing new in-memory state store" Jan 14 01:41:05.263066 kubelet[2905]: I0114 01:41:05.263051 2905 state_mem.go:75] "Updated machine memory state" Jan 14 01:41:05.266584 kubelet[2905]: E0114 01:41:05.266555 2905 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 14 01:41:05.266859 kubelet[2905]: I0114 
01:41:05.266714 2905 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 14 01:41:05.266935 kubelet[2905]: I0114 01:41:05.266861 2905 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 14 01:41:05.267271 kubelet[2905]: I0114 01:41:05.267063 2905 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 14 01:41:05.268924 kubelet[2905]: E0114 01:41:05.268903 2905 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 14 01:41:05.329415 kubelet[2905]: I0114 01:41:05.329375 2905 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:05.329571 kubelet[2905]: I0114 01:41:05.329387 2905 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:05.332136 kubelet[2905]: I0114 01:41:05.330841 2905 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:05.369675 kubelet[2905]: I0114 01:41:05.369648 2905 kubelet_node_status.go:75] "Attempting to register node" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:05.377134 kubelet[2905]: I0114 01:41:05.377094 2905 kubelet_node_status.go:124] "Node was previously registered" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:05.377249 kubelet[2905]: I0114 01:41:05.377181 2905 kubelet_node_status.go:78] "Successfully registered node" node="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:05.515885 kubelet[2905]: I0114 01:41:05.515761 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7f1234cf1ba046a8054498393ed73983-ca-certs\") pod \"kube-apiserver-ci-4578-0-0-p-96753e66ce\" (UID: \"7f1234cf1ba046a8054498393ed73983\") " 
pod="kube-system/kube-apiserver-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:05.515885 kubelet[2905]: I0114 01:41:05.515802 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7f1234cf1ba046a8054498393ed73983-k8s-certs\") pod \"kube-apiserver-ci-4578-0-0-p-96753e66ce\" (UID: \"7f1234cf1ba046a8054498393ed73983\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:05.515885 kubelet[2905]: I0114 01:41:05.515836 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7f1234cf1ba046a8054498393ed73983-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4578-0-0-p-96753e66ce\" (UID: \"7f1234cf1ba046a8054498393ed73983\") " pod="kube-system/kube-apiserver-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:05.515885 kubelet[2905]: I0114 01:41:05.515852 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/21692cf7dfa799cca6b877e5612fce31-ca-certs\") pod \"kube-controller-manager-ci-4578-0-0-p-96753e66ce\" (UID: \"21692cf7dfa799cca6b877e5612fce31\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:05.516242 kubelet[2905]: I0114 01:41:05.516021 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/21692cf7dfa799cca6b877e5612fce31-flexvolume-dir\") pod \"kube-controller-manager-ci-4578-0-0-p-96753e66ce\" (UID: \"21692cf7dfa799cca6b877e5612fce31\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:05.516242 kubelet[2905]: I0114 01:41:05.516141 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/21692cf7dfa799cca6b877e5612fce31-k8s-certs\") pod \"kube-controller-manager-ci-4578-0-0-p-96753e66ce\" (UID: \"21692cf7dfa799cca6b877e5612fce31\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:05.516242 kubelet[2905]: I0114 01:41:05.516193 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/21692cf7dfa799cca6b877e5612fce31-kubeconfig\") pod \"kube-controller-manager-ci-4578-0-0-p-96753e66ce\" (UID: \"21692cf7dfa799cca6b877e5612fce31\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:05.516336 kubelet[2905]: I0114 01:41:05.516240 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/21692cf7dfa799cca6b877e5612fce31-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4578-0-0-p-96753e66ce\" (UID: \"21692cf7dfa799cca6b877e5612fce31\") " pod="kube-system/kube-controller-manager-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:05.516336 kubelet[2905]: I0114 01:41:05.516269 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/dd26776522857ef53a2210b52507be51-kubeconfig\") pod \"kube-scheduler-ci-4578-0-0-p-96753e66ce\" (UID: \"dd26776522857ef53a2210b52507be51\") " pod="kube-system/kube-scheduler-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:06.182943 kubelet[2905]: I0114 01:41:06.182857 2905 apiserver.go:52] "Watching apiserver" Jan 14 01:41:06.215692 kubelet[2905]: I0114 01:41:06.215650 2905 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 14 01:41:06.243322 kubelet[2905]: I0114 01:41:06.243285 2905 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4578-0-0-p-96753e66ce" Jan 14 
01:41:06.243445 kubelet[2905]: I0114 01:41:06.243286 2905 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:06.250445 kubelet[2905]: E0114 01:41:06.250410 2905 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4578-0-0-p-96753e66ce\" already exists" pod="kube-system/kube-scheduler-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:06.250726 kubelet[2905]: E0114 01:41:06.250695 2905 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4578-0-0-p-96753e66ce\" already exists" pod="kube-system/kube-apiserver-ci-4578-0-0-p-96753e66ce" Jan 14 01:41:06.270298 kubelet[2905]: I0114 01:41:06.270081 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4578-0-0-p-96753e66ce" podStartSLOduration=1.270064165 podStartE2EDuration="1.270064165s" podCreationTimestamp="2026-01-14 01:41:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:41:06.269803365 +0000 UTC m=+1.143289400" watchObservedRunningTime="2026-01-14 01:41:06.270064165 +0000 UTC m=+1.143550240" Jan 14 01:41:06.270298 kubelet[2905]: I0114 01:41:06.270211 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4578-0-0-p-96753e66ce" podStartSLOduration=1.270205846 podStartE2EDuration="1.270205846s" podCreationTimestamp="2026-01-14 01:41:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:41:06.262142705 +0000 UTC m=+1.135628780" watchObservedRunningTime="2026-01-14 01:41:06.270205846 +0000 UTC m=+1.143691921" Jan 14 01:41:06.279018 kubelet[2905]: I0114 01:41:06.278899 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-controller-manager-ci-4578-0-0-p-96753e66ce" podStartSLOduration=1.2788858269999999 podStartE2EDuration="1.278885827s" podCreationTimestamp="2026-01-14 01:41:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:41:06.278102345 +0000 UTC m=+1.151588420" watchObservedRunningTime="2026-01-14 01:41:06.278885827 +0000 UTC m=+1.152371902" Jan 14 01:41:07.324754 update_engine[1654]: I20260114 01:41:07.324437 1654 update_attempter.cc:509] Updating boot flags... Jan 14 01:41:11.357477 kubelet[2905]: I0114 01:41:11.357370 2905 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 14 01:41:11.357864 containerd[1676]: time="2026-01-14T01:41:11.357757447Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 14 01:41:11.358037 kubelet[2905]: I0114 01:41:11.357977 2905 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 14 01:41:12.462411 systemd[1]: Created slice kubepods-besteffort-pode174679c_5595_4ba3_965a_feaab96d6b70.slice - libcontainer container kubepods-besteffort-pode174679c_5595_4ba3_965a_feaab96d6b70.slice. 
Jan 14 01:41:12.559023 kubelet[2905]: I0114 01:41:12.558971 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7lxw\" (UniqueName: \"kubernetes.io/projected/771e9332-cf06-40d6-9db7-e0ca09e8572b-kube-api-access-c7lxw\") pod \"tigera-operator-7dcd859c48-qllzk\" (UID: \"771e9332-cf06-40d6-9db7-e0ca09e8572b\") " pod="tigera-operator/tigera-operator-7dcd859c48-qllzk" Jan 14 01:41:12.560947 kubelet[2905]: I0114 01:41:12.559365 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/e174679c-5595-4ba3-965a-feaab96d6b70-kube-proxy\") pod \"kube-proxy-tpmjk\" (UID: \"e174679c-5595-4ba3-965a-feaab96d6b70\") " pod="kube-system/kube-proxy-tpmjk" Jan 14 01:41:12.560947 kubelet[2905]: I0114 01:41:12.559400 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e174679c-5595-4ba3-965a-feaab96d6b70-lib-modules\") pod \"kube-proxy-tpmjk\" (UID: \"e174679c-5595-4ba3-965a-feaab96d6b70\") " pod="kube-system/kube-proxy-tpmjk" Jan 14 01:41:12.560947 kubelet[2905]: I0114 01:41:12.559415 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/771e9332-cf06-40d6-9db7-e0ca09e8572b-var-lib-calico\") pod \"tigera-operator-7dcd859c48-qllzk\" (UID: \"771e9332-cf06-40d6-9db7-e0ca09e8572b\") " pod="tigera-operator/tigera-operator-7dcd859c48-qllzk" Jan 14 01:41:12.560947 kubelet[2905]: I0114 01:41:12.559439 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dgw4\" (UniqueName: \"kubernetes.io/projected/e174679c-5595-4ba3-965a-feaab96d6b70-kube-api-access-2dgw4\") pod \"kube-proxy-tpmjk\" (UID: \"e174679c-5595-4ba3-965a-feaab96d6b70\") " 
pod="kube-system/kube-proxy-tpmjk" Jan 14 01:41:12.560947 kubelet[2905]: I0114 01:41:12.559459 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e174679c-5595-4ba3-965a-feaab96d6b70-xtables-lock\") pod \"kube-proxy-tpmjk\" (UID: \"e174679c-5595-4ba3-965a-feaab96d6b70\") " pod="kube-system/kube-proxy-tpmjk" Jan 14 01:41:12.559337 systemd[1]: Created slice kubepods-besteffort-pod771e9332_cf06_40d6_9db7_e0ca09e8572b.slice - libcontainer container kubepods-besteffort-pod771e9332_cf06_40d6_9db7_e0ca09e8572b.slice. Jan 14 01:41:12.770133 containerd[1676]: time="2026-01-14T01:41:12.770065830Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-tpmjk,Uid:e174679c-5595-4ba3-965a-feaab96d6b70,Namespace:kube-system,Attempt:0,}" Jan 14 01:41:12.789325 containerd[1676]: time="2026-01-14T01:41:12.789284238Z" level=info msg="connecting to shim 001af3fc2bae1107949017b50496793bf188e3734ca5a94d720017da270986c7" address="unix:///run/containerd/s/dc7d1d0cf494f1ed561b95773530eb8dadf2e2aaa18b7b0907a24342b64adf9e" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:12.816101 systemd[1]: Started cri-containerd-001af3fc2bae1107949017b50496793bf188e3734ca5a94d720017da270986c7.scope - libcontainer container 001af3fc2bae1107949017b50496793bf188e3734ca5a94d720017da270986c7. 
Jan 14 01:41:12.822000 audit: BPF prog-id=133 op=LOAD Jan 14 01:41:12.825393 kernel: kauditd_printk_skb: 164 callbacks suppressed Jan 14 01:41:12.825450 kernel: audit: type=1334 audit(1768354872.822:437): prog-id=133 op=LOAD Jan 14 01:41:12.824000 audit: BPF prog-id=134 op=LOAD Jan 14 01:41:12.827370 kernel: audit: type=1334 audit(1768354872.824:438): prog-id=134 op=LOAD Jan 14 01:41:12.827400 kernel: audit: type=1300 audit(1768354872.824:438): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2985 pid=2996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.824000 audit[2996]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2985 pid=2996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.824000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316166336663326261653131303739343930313762353034393637 Jan 14 01:41:12.835359 kernel: audit: type=1327 audit(1768354872.824:438): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316166336663326261653131303739343930313762353034393637 Jan 14 01:41:12.835476 kernel: audit: type=1334 audit(1768354872.824:439): prog-id=134 op=UNLOAD Jan 14 01:41:12.824000 audit: BPF prog-id=134 op=UNLOAD Jan 14 01:41:12.824000 audit[2996]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2985 pid=2996 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.839984 kernel: audit: type=1300 audit(1768354872.824:439): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2985 pid=2996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.824000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316166336663326261653131303739343930313762353034393637 Jan 14 01:41:12.843775 kernel: audit: type=1327 audit(1768354872.824:439): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316166336663326261653131303739343930313762353034393637 Jan 14 01:41:12.844217 kernel: audit: type=1334 audit(1768354872.824:440): prog-id=135 op=LOAD Jan 14 01:41:12.824000 audit: BPF prog-id=135 op=LOAD Jan 14 01:41:12.844849 kernel: audit: type=1300 audit(1768354872.824:440): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2985 pid=2996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.824000 audit[2996]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2985 pid=2996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
14 01:41:12.848212 kernel: audit: type=1327 audit(1768354872.824:440): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316166336663326261653131303739343930313762353034393637 Jan 14 01:41:12.824000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316166336663326261653131303739343930313762353034393637 Jan 14 01:41:12.825000 audit: BPF prog-id=136 op=LOAD Jan 14 01:41:12.825000 audit[2996]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2985 pid=2996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.825000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316166336663326261653131303739343930313762353034393637 Jan 14 01:41:12.830000 audit: BPF prog-id=136 op=UNLOAD Jan 14 01:41:12.830000 audit[2996]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2985 pid=2996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.830000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316166336663326261653131303739343930313762353034393637 Jan 14 01:41:12.830000 audit: BPF prog-id=135 op=UNLOAD Jan 14 01:41:12.830000 audit[2996]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2985 pid=2996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.830000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316166336663326261653131303739343930313762353034393637 Jan 14 01:41:12.830000 audit: BPF prog-id=137 op=LOAD Jan 14 01:41:12.830000 audit[2996]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2985 pid=2996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.830000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030316166336663326261653131303739343930313762353034393637 Jan 14 01:41:12.864643 containerd[1676]: time="2026-01-14T01:41:12.864596267Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-qllzk,Uid:771e9332-cf06-40d6-9db7-e0ca09e8572b,Namespace:tigera-operator,Attempt:0,}" Jan 14 01:41:12.865149 containerd[1676]: time="2026-01-14T01:41:12.865114788Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-proxy-tpmjk,Uid:e174679c-5595-4ba3-965a-feaab96d6b70,Namespace:kube-system,Attempt:0,} returns sandbox id \"001af3fc2bae1107949017b50496793bf188e3734ca5a94d720017da270986c7\"" Jan 14 01:41:12.870845 containerd[1676]: time="2026-01-14T01:41:12.870810923Z" level=info msg="CreateContainer within sandbox \"001af3fc2bae1107949017b50496793bf188e3734ca5a94d720017da270986c7\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 14 01:41:12.887132 containerd[1676]: time="2026-01-14T01:41:12.887015923Z" level=info msg="Container e999a8f72b3fd88fc993ad4c242671d12b049e71f8d26266803c56917d2455f9: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:41:12.887132 containerd[1676]: time="2026-01-14T01:41:12.887072203Z" level=info msg="connecting to shim edcc7f32f2e0ed11b4938dde7cada78d8a3ed5c68b3af018761b4aa2009ba1c4" address="unix:///run/containerd/s/588911ba48772895c681676704d18d1534961b186f9fc1d625b50572b007f43d" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:12.896482 containerd[1676]: time="2026-01-14T01:41:12.896436987Z" level=info msg="CreateContainer within sandbox \"001af3fc2bae1107949017b50496793bf188e3734ca5a94d720017da270986c7\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"e999a8f72b3fd88fc993ad4c242671d12b049e71f8d26266803c56917d2455f9\"" Jan 14 01:41:12.898203 containerd[1676]: time="2026-01-14T01:41:12.897961751Z" level=info msg="StartContainer for \"e999a8f72b3fd88fc993ad4c242671d12b049e71f8d26266803c56917d2455f9\"" Jan 14 01:41:12.900648 containerd[1676]: time="2026-01-14T01:41:12.900618677Z" level=info msg="connecting to shim e999a8f72b3fd88fc993ad4c242671d12b049e71f8d26266803c56917d2455f9" address="unix:///run/containerd/s/dc7d1d0cf494f1ed561b95773530eb8dadf2e2aaa18b7b0907a24342b64adf9e" protocol=ttrpc version=3 Jan 14 01:41:12.914936 systemd[1]: Started cri-containerd-edcc7f32f2e0ed11b4938dde7cada78d8a3ed5c68b3af018761b4aa2009ba1c4.scope - libcontainer container 
edcc7f32f2e0ed11b4938dde7cada78d8a3ed5c68b3af018761b4aa2009ba1c4. Jan 14 01:41:12.922954 systemd[1]: Started cri-containerd-e999a8f72b3fd88fc993ad4c242671d12b049e71f8d26266803c56917d2455f9.scope - libcontainer container e999a8f72b3fd88fc993ad4c242671d12b049e71f8d26266803c56917d2455f9. Jan 14 01:41:12.927000 audit: BPF prog-id=138 op=LOAD Jan 14 01:41:12.928000 audit: BPF prog-id=139 op=LOAD Jan 14 01:41:12.928000 audit[3040]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3029 pid=3040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.928000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564636337663332663265306564313162343933386464653763616461 Jan 14 01:41:12.928000 audit: BPF prog-id=139 op=UNLOAD Jan 14 01:41:12.928000 audit[3040]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3029 pid=3040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.928000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564636337663332663265306564313162343933386464653763616461 Jan 14 01:41:12.928000 audit: BPF prog-id=140 op=LOAD Jan 14 01:41:12.928000 audit[3040]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3029 pid=3040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.928000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564636337663332663265306564313162343933386464653763616461 Jan 14 01:41:12.929000 audit: BPF prog-id=141 op=LOAD Jan 14 01:41:12.929000 audit[3040]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3029 pid=3040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.929000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564636337663332663265306564313162343933386464653763616461 Jan 14 01:41:12.929000 audit: BPF prog-id=141 op=UNLOAD Jan 14 01:41:12.929000 audit[3040]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3029 pid=3040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.929000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564636337663332663265306564313162343933386464653763616461 Jan 14 01:41:12.929000 audit: BPF prog-id=140 op=UNLOAD Jan 14 01:41:12.929000 audit[3040]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3029 pid=3040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.929000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564636337663332663265306564313162343933386464653763616461 Jan 14 01:41:12.929000 audit: BPF prog-id=142 op=LOAD Jan 14 01:41:12.929000 audit[3040]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3029 pid=3040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.929000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564636337663332663265306564313162343933386464653763616461 Jan 14 01:41:12.957060 containerd[1676]: time="2026-01-14T01:41:12.956946499Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-qllzk,Uid:771e9332-cf06-40d6-9db7-e0ca09e8572b,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"edcc7f32f2e0ed11b4938dde7cada78d8a3ed5c68b3af018761b4aa2009ba1c4\"" Jan 14 01:41:12.959127 containerd[1676]: time="2026-01-14T01:41:12.959092944Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 14 01:41:12.964000 audit: BPF prog-id=143 op=LOAD Jan 14 01:41:12.964000 audit[3052]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=2985 pid=3052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.964000 audit: 
PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539393961386637326233666438386663393933616434633234323637 Jan 14 01:41:12.964000 audit: BPF prog-id=144 op=LOAD Jan 14 01:41:12.964000 audit[3052]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=2985 pid=3052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.964000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539393961386637326233666438386663393933616434633234323637 Jan 14 01:41:12.964000 audit: BPF prog-id=144 op=UNLOAD Jan 14 01:41:12.964000 audit[3052]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2985 pid=3052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.964000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539393961386637326233666438386663393933616434633234323637 Jan 14 01:41:12.964000 audit: BPF prog-id=143 op=UNLOAD Jan 14 01:41:12.964000 audit[3052]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2985 pid=3052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:41:12.964000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539393961386637326233666438386663393933616434633234323637 Jan 14 01:41:12.964000 audit: BPF prog-id=145 op=LOAD Jan 14 01:41:12.964000 audit[3052]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=2985 pid=3052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:12.964000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6539393961386637326233666438386663393933616434633234323637 Jan 14 01:41:12.985425 containerd[1676]: time="2026-01-14T01:41:12.985385490Z" level=info msg="StartContainer for \"e999a8f72b3fd88fc993ad4c242671d12b049e71f8d26266803c56917d2455f9\" returns successfully" Jan 14 01:41:13.140000 audit[3130]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3130 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.140000 audit[3130]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff4bcaea0 a2=0 a3=1 items=0 ppid=3072 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.140000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 14 01:41:13.140000 audit[3132]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3132 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.140000 audit[3132]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffffce42d70 a2=0 a3=1 items=0 ppid=3072 pid=3132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.140000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 14 01:41:13.141000 audit[3135]: NETFILTER_CFG table=nat:56 family=10 entries=1 op=nft_register_chain pid=3135 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.141000 audit[3135]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffc8326a0 a2=0 a3=1 items=0 ppid=3072 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.141000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 14 01:41:13.142000 audit[3134]: NETFILTER_CFG table=nat:57 family=2 entries=1 op=nft_register_chain pid=3134 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.142000 audit[3134]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd2d80b60 a2=0 a3=1 items=0 ppid=3072 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.142000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 14 01:41:13.144000 audit[3137]: NETFILTER_CFG table=filter:58 family=2 entries=1 op=nft_register_chain pid=3137 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.144000 audit[3137]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff606ed90 a2=0 a3=1 items=0 ppid=3072 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.144000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 14 01:41:13.146000 audit[3138]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3138 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.146000 audit[3138]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe9fb91e0 a2=0 a3=1 items=0 ppid=3072 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.146000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 14 01:41:13.244000 audit[3139]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3139 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.244000 audit[3139]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffde19eb10 a2=0 a3=1 items=0 ppid=3072 pid=3139 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.244000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 14 01:41:13.246000 audit[3141]: NETFILTER_CFG table=filter:61 family=2 entries=1 
op=nft_register_rule pid=3141 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.246000 audit[3141]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=fffff8dffc10 a2=0 a3=1 items=0 ppid=3072 pid=3141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.246000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 14 01:41:13.250000 audit[3144]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3144 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.250000 audit[3144]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffdbc8c880 a2=0 a3=1 items=0 ppid=3072 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.250000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 14 01:41:13.251000 audit[3145]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3145 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.251000 audit[3145]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcd2faf50 a2=0 a3=1 items=0 ppid=3072 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.251000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 14 01:41:13.254000 audit[3147]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3147 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.254000 audit[3147]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd1485860 a2=0 a3=1 items=0 ppid=3072 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.254000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 14 01:41:13.256000 audit[3148]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3148 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.256000 audit[3148]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffee2dcfc0 a2=0 a3=1 items=0 ppid=3072 pid=3148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.256000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 14 01:41:13.260000 audit[3150]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3150 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.260000 audit[3150]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffc0954a40 a2=0 a3=1 items=0 ppid=3072 
pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.260000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 14 01:41:13.264000 audit[3153]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3153 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.264000 audit[3153]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffe59591a0 a2=0 a3=1 items=0 ppid=3072 pid=3153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.264000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 14 01:41:13.268008 kubelet[2905]: I0114 01:41:13.267941 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-tpmjk" podStartSLOduration=1.267925119 podStartE2EDuration="1.267925119s" podCreationTimestamp="2026-01-14 01:41:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:41:13.267655958 +0000 UTC m=+8.141142113" watchObservedRunningTime="2026-01-14 01:41:13.267925119 +0000 UTC m=+8.141411154" Jan 14 01:41:13.266000 audit[3154]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3154 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.266000 audit[3154]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd7f7dbf0 a2=0 a3=1 items=0 ppid=3072 pid=3154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.266000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 14 01:41:13.269000 audit[3156]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3156 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.269000 audit[3156]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffed5d1ca0 a2=0 a3=1 items=0 ppid=3072 pid=3156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.269000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 14 01:41:13.271000 audit[3157]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3157 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.271000 audit[3157]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffffca19910 a2=0 a3=1 items=0 ppid=3072 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.271000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 
14 01:41:13.274000 audit[3159]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3159 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.274000 audit[3159]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffffc13cec0 a2=0 a3=1 items=0 ppid=3072 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.274000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 14 01:41:13.280000 audit[3162]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3162 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.280000 audit[3162]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffcf734bc0 a2=0 a3=1 items=0 ppid=3072 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.280000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 14 01:41:13.283000 audit[3165]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3165 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.283000 audit[3165]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff74bc940 a2=0 a3=1 items=0 ppid=3072 pid=3165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.283000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 14 01:41:13.284000 audit[3166]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3166 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.284000 audit[3166]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd7b8e640 a2=0 a3=1 items=0 ppid=3072 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.284000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 14 01:41:13.287000 audit[3168]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3168 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.287000 audit[3168]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffd14eb0e0 a2=0 a3=1 items=0 ppid=3072 pid=3168 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.287000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:41:13.290000 audit[3171]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3171 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.290000 audit[3171]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffca01b480 a2=0 a3=1 items=0 ppid=3072 pid=3171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.290000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:41:13.291000 audit[3172]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3172 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.291000 audit[3172]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff6487f10 a2=0 a3=1 items=0 ppid=3072 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.291000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 14 01:41:13.294000 audit[3174]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3174 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 01:41:13.294000 audit[3174]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffe3be32a0 a2=0 a3=1 items=0 ppid=3072 pid=3174 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.294000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 14 01:41:13.316000 audit[3180]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3180 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:13.316000 audit[3180]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc8efaee0 a2=0 a3=1 items=0 ppid=3072 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.316000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:13.326000 audit[3180]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3180 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:13.326000 audit[3180]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=ffffc8efaee0 a2=0 a3=1 items=0 ppid=3072 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.326000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:13.328000 audit[3185]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3185 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.328000 audit[3185]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=fffff82e85e0 a2=0 a3=1 items=0 ppid=3072 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.328000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 14 01:41:13.331000 audit[3187]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3187 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.331000 audit[3187]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=ffffe6bcb7d0 a2=0 a3=1 items=0 ppid=3072 pid=3187 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.331000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 14 01:41:13.335000 audit[3190]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3190 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.335000 audit[3190]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffca360970 a2=0 a3=1 items=0 ppid=3072 pid=3190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.335000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 14 01:41:13.336000 audit[3191]: NETFILTER_CFG table=filter:84 family=10 
entries=1 op=nft_register_chain pid=3191 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.336000 audit[3191]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffc6633b0 a2=0 a3=1 items=0 ppid=3072 pid=3191 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.336000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 14 01:41:13.339000 audit[3193]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3193 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.339000 audit[3193]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff4010240 a2=0 a3=1 items=0 ppid=3072 pid=3193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.339000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 14 01:41:13.340000 audit[3194]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3194 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.340000 audit[3194]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc8e64fe0 a2=0 a3=1 items=0 ppid=3072 pid=3194 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.340000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 14 01:41:13.342000 audit[3196]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3196 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.342000 audit[3196]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffff2bfc8b0 a2=0 a3=1 items=0 ppid=3072 pid=3196 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.342000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 14 01:41:13.346000 audit[3199]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3199 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.346000 audit[3199]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffdfac97d0 a2=0 a3=1 items=0 ppid=3072 pid=3199 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.346000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 14 01:41:13.347000 audit[3200]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3200 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.347000 audit[3200]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=100 a0=3 a1=fffffcc7a750 a2=0 a3=1 items=0 ppid=3072 pid=3200 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.347000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 14 01:41:13.350000 audit[3202]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3202 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.350000 audit[3202]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffcfa15570 a2=0 a3=1 items=0 ppid=3072 pid=3202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.350000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 14 01:41:13.351000 audit[3203]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3203 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.351000 audit[3203]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe145bce0 a2=0 a3=1 items=0 ppid=3072 pid=3203 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.351000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 14 01:41:13.354000 audit[3205]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule 
pid=3205 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.354000 audit[3205]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd0acf6d0 a2=0 a3=1 items=0 ppid=3072 pid=3205 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.354000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 14 01:41:13.359000 audit[3208]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3208 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.359000 audit[3208]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffe47806b0 a2=0 a3=1 items=0 ppid=3072 pid=3208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.359000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 14 01:41:13.364000 audit[3211]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3211 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.364000 audit[3211]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd8ac8410 a2=0 a3=1 items=0 ppid=3072 pid=3211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.364000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 14 01:41:13.365000 audit[3212]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3212 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.365000 audit[3212]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd5ef5580 a2=0 a3=1 items=0 ppid=3072 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.365000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 14 01:41:13.367000 audit[3214]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3214 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.367000 audit[3214]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffcaee27c0 a2=0 a3=1 items=0 ppid=3072 pid=3214 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.367000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:41:13.371000 audit[3217]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3217 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.371000 audit[3217]: SYSCALL 
arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff716bf60 a2=0 a3=1 items=0 ppid=3072 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.371000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 01:41:13.372000 audit[3218]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3218 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.372000 audit[3218]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd46a7350 a2=0 a3=1 items=0 ppid=3072 pid=3218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.372000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 14 01:41:13.374000 audit[3220]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3220 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.374000 audit[3220]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffcc47bd60 a2=0 a3=1 items=0 ppid=3072 pid=3220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.374000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 14 01:41:13.375000 audit[3221]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3221 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.375000 audit[3221]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe522f100 a2=0 a3=1 items=0 ppid=3072 pid=3221 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.375000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 14 01:41:13.378000 audit[3223]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3223 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.378000 audit[3223]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffc53e1110 a2=0 a3=1 items=0 ppid=3072 pid=3223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.378000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:41:13.381000 audit[3226]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 01:41:13.381000 audit[3226]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffc57f54b0 a2=0 a3=1 items=0 ppid=3072 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.381000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 01:41:13.384000 audit[3228]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3228 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 14 01:41:13.384000 audit[3228]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 a1=ffffe0d4bed0 a2=0 a3=1 items=0 ppid=3072 pid=3228 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.384000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:13.385000 audit[3228]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3228 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 14 01:41:13.385000 audit[3228]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffe0d4bed0 a2=0 a3=1 items=0 ppid=3072 pid=3228 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:13.385000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:14.935502 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2736656707.mount: Deactivated successfully. 
Jan 14 01:41:15.206827 containerd[1676]: time="2026-01-14T01:41:15.206362861Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:15.207326 containerd[1676]: time="2026-01-14T01:41:15.207290944Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=0" Jan 14 01:41:15.208239 containerd[1676]: time="2026-01-14T01:41:15.208218146Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:15.210392 containerd[1676]: time="2026-01-14T01:41:15.210340511Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:15.210863 containerd[1676]: time="2026-01-14T01:41:15.210839193Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.251713569s" Jan 14 01:41:15.210895 containerd[1676]: time="2026-01-14T01:41:15.210872073Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Jan 14 01:41:15.216290 containerd[1676]: time="2026-01-14T01:41:15.216248726Z" level=info msg="CreateContainer within sandbox \"edcc7f32f2e0ed11b4938dde7cada78d8a3ed5c68b3af018761b4aa2009ba1c4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 14 01:41:15.222441 containerd[1676]: time="2026-01-14T01:41:15.222371541Z" level=info msg="Container 
bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:41:15.226482 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2278596593.mount: Deactivated successfully. Jan 14 01:41:15.231931 containerd[1676]: time="2026-01-14T01:41:15.231805845Z" level=info msg="CreateContainer within sandbox \"edcc7f32f2e0ed11b4938dde7cada78d8a3ed5c68b3af018761b4aa2009ba1c4\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5\"" Jan 14 01:41:15.232231 containerd[1676]: time="2026-01-14T01:41:15.232209086Z" level=info msg="StartContainer for \"bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5\"" Jan 14 01:41:15.233185 containerd[1676]: time="2026-01-14T01:41:15.233160129Z" level=info msg="connecting to shim bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5" address="unix:///run/containerd/s/588911ba48772895c681676704d18d1534961b186f9fc1d625b50572b007f43d" protocol=ttrpc version=3 Jan 14 01:41:15.256039 systemd[1]: Started cri-containerd-bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5.scope - libcontainer container bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5. 
Jan 14 01:41:15.266000 audit: BPF prog-id=146 op=LOAD Jan 14 01:41:15.267000 audit: BPF prog-id=147 op=LOAD Jan 14 01:41:15.267000 audit[3237]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3029 pid=3237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:15.267000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263623964633939313039366131616431373931663066316562663037 Jan 14 01:41:15.267000 audit: BPF prog-id=147 op=UNLOAD Jan 14 01:41:15.267000 audit[3237]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3029 pid=3237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:15.267000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263623964633939313039366131616431373931663066316562663037 Jan 14 01:41:15.267000 audit: BPF prog-id=148 op=LOAD Jan 14 01:41:15.267000 audit[3237]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3029 pid=3237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:15.267000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263623964633939313039366131616431373931663066316562663037 Jan 14 01:41:15.267000 audit: BPF prog-id=149 op=LOAD Jan 14 01:41:15.267000 audit[3237]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3029 pid=3237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:15.267000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263623964633939313039366131616431373931663066316562663037 Jan 14 01:41:15.267000 audit: BPF prog-id=149 op=UNLOAD Jan 14 01:41:15.267000 audit[3237]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3029 pid=3237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:15.267000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263623964633939313039366131616431373931663066316562663037 Jan 14 01:41:15.267000 audit: BPF prog-id=148 op=UNLOAD Jan 14 01:41:15.267000 audit[3237]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3029 pid=3237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
14 01:41:15.267000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263623964633939313039366131616431373931663066316562663037 Jan 14 01:41:15.267000 audit: BPF prog-id=150 op=LOAD Jan 14 01:41:15.267000 audit[3237]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3029 pid=3237 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:15.267000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263623964633939313039366131616431373931663066316562663037 Jan 14 01:41:15.290493 containerd[1676]: time="2026-01-14T01:41:15.290457432Z" level=info msg="StartContainer for \"bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5\" returns successfully" Jan 14 01:41:16.276703 kubelet[2905]: I0114 01:41:16.276276 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-qllzk" podStartSLOduration=2.023139013 podStartE2EDuration="4.276260225s" podCreationTimestamp="2026-01-14 01:41:12 +0000 UTC" firstStartedPulling="2026-01-14 01:41:12.958453182 +0000 UTC m=+7.831939257" lastFinishedPulling="2026-01-14 01:41:15.211574394 +0000 UTC m=+10.085060469" observedRunningTime="2026-01-14 01:41:16.275630424 +0000 UTC m=+11.149116459" watchObservedRunningTime="2026-01-14 01:41:16.276260225 +0000 UTC m=+11.149746300" Jan 14 01:41:17.317931 systemd[1]: cri-containerd-bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5.scope: Deactivated successfully. 
Jan 14 01:41:17.322738 containerd[1676]: time="2026-01-14T01:41:17.322691090Z" level=info msg="received container exit event container_id:\"bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5\" id:\"bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5\" pid:3250 exit_status:1 exited_at:{seconds:1768354877 nanos:322247769}" Jan 14 01:41:17.322000 audit: BPF prog-id=146 op=UNLOAD Jan 14 01:41:17.322000 audit: BPF prog-id=150 op=UNLOAD Jan 14 01:41:17.361097 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5-rootfs.mount: Deactivated successfully. Jan 14 01:41:18.274326 kubelet[2905]: I0114 01:41:18.274277 2905 scope.go:117] "RemoveContainer" containerID="bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5" Jan 14 01:41:18.276422 containerd[1676]: time="2026-01-14T01:41:18.276374043Z" level=info msg="CreateContainer within sandbox \"edcc7f32f2e0ed11b4938dde7cada78d8a3ed5c68b3af018761b4aa2009ba1c4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 14 01:41:18.291914 containerd[1676]: time="2026-01-14T01:41:18.291863681Z" level=info msg="Container b6ab4df23c040366dc69c84b2a8511db28ca7078235904ae9790ea5b1cb8081a: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:41:18.298549 containerd[1676]: time="2026-01-14T01:41:18.298508898Z" level=info msg="CreateContainer within sandbox \"edcc7f32f2e0ed11b4938dde7cada78d8a3ed5c68b3af018761b4aa2009ba1c4\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"b6ab4df23c040366dc69c84b2a8511db28ca7078235904ae9790ea5b1cb8081a\"" Jan 14 01:41:18.299732 containerd[1676]: time="2026-01-14T01:41:18.299492181Z" level=info msg="StartContainer for \"b6ab4df23c040366dc69c84b2a8511db28ca7078235904ae9790ea5b1cb8081a\"" Jan 14 01:41:18.300741 containerd[1676]: time="2026-01-14T01:41:18.300499783Z" level=info msg="connecting to shim 
b6ab4df23c040366dc69c84b2a8511db28ca7078235904ae9790ea5b1cb8081a" address="unix:///run/containerd/s/588911ba48772895c681676704d18d1534961b186f9fc1d625b50572b007f43d" protocol=ttrpc version=3 Jan 14 01:41:18.328943 systemd[1]: Started cri-containerd-b6ab4df23c040366dc69c84b2a8511db28ca7078235904ae9790ea5b1cb8081a.scope - libcontainer container b6ab4df23c040366dc69c84b2a8511db28ca7078235904ae9790ea5b1cb8081a. Jan 14 01:41:18.337000 audit: BPF prog-id=151 op=LOAD Jan 14 01:41:18.340272 kernel: kauditd_printk_skb: 226 callbacks suppressed Jan 14 01:41:18.340355 kernel: audit: type=1334 audit(1768354878.337:519): prog-id=151 op=LOAD Jan 14 01:41:18.338000 audit: BPF prog-id=152 op=LOAD Jan 14 01:41:18.341909 kernel: audit: type=1334 audit(1768354878.338:520): prog-id=152 op=LOAD Jan 14 01:41:18.342000 kernel: audit: type=1300 audit(1768354878.338:520): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400019c180 a2=98 a3=0 items=0 ppid=3029 pid=3317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:18.338000 audit[3317]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400019c180 a2=98 a3=0 items=0 ppid=3029 pid=3317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:18.338000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236616234646632336330343033363664633639633834623261383531 Jan 14 01:41:18.348759 kernel: audit: type=1327 audit(1768354878.338:520): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236616234646632336330343033363664633639633834623261383531 Jan 14 01:41:18.348856 kernel: audit: type=1334 audit(1768354878.339:521): prog-id=152 op=UNLOAD Jan 14 01:41:18.339000 audit: BPF prog-id=152 op=UNLOAD Jan 14 01:41:18.339000 audit[3317]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3029 pid=3317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:18.353748 kernel: audit: type=1300 audit(1768354878.339:521): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3029 pid=3317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:18.353833 kernel: audit: type=1327 audit(1768354878.339:521): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236616234646632336330343033363664633639633834623261383531 Jan 14 01:41:18.339000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236616234646632336330343033363664633639633834623261383531 Jan 14 01:41:18.339000 audit: BPF prog-id=153 op=LOAD Jan 14 01:41:18.357808 kernel: audit: type=1334 audit(1768354878.339:522): prog-id=153 op=LOAD Jan 14 01:41:18.357866 kernel: audit: type=1300 audit(1768354878.339:522): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 
a1=400019c3e8 a2=98 a3=0 items=0 ppid=3029 pid=3317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:18.339000 audit[3317]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400019c3e8 a2=98 a3=0 items=0 ppid=3029 pid=3317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:18.339000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236616234646632336330343033363664633639633834623261383531 Jan 14 01:41:18.365125 kernel: audit: type=1327 audit(1768354878.339:522): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236616234646632336330343033363664633639633834623261383531 Jan 14 01:41:18.340000 audit: BPF prog-id=154 op=LOAD Jan 14 01:41:18.340000 audit[3317]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=400019c168 a2=98 a3=0 items=0 ppid=3029 pid=3317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:18.340000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236616234646632336330343033363664633639633834623261383531 Jan 14 01:41:18.343000 audit: BPF prog-id=154 op=UNLOAD Jan 14 01:41:18.343000 audit[3317]: SYSCALL arch=c00000b7 
syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3029 pid=3317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:18.343000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236616234646632336330343033363664633639633834623261383531 Jan 14 01:41:18.343000 audit: BPF prog-id=153 op=UNLOAD Jan 14 01:41:18.343000 audit[3317]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3029 pid=3317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:18.343000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236616234646632336330343033363664633639633834623261383531 Jan 14 01:41:18.343000 audit: BPF prog-id=155 op=LOAD Jan 14 01:41:18.343000 audit[3317]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400019c648 a2=98 a3=0 items=0 ppid=3029 pid=3317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:18.343000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236616234646632336330343033363664633639633834623261383531 Jan 14 01:41:18.378886 containerd[1676]: time="2026-01-14T01:41:18.378847300Z" 
level=info msg="StartContainer for \"b6ab4df23c040366dc69c84b2a8511db28ca7078235904ae9790ea5b1cb8081a\" returns successfully" Jan 14 01:41:20.438176 sudo[1948]: pam_unix(sudo:session): session closed for user root Jan 14 01:41:20.436000 audit[1948]: USER_END pid=1948 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:41:20.436000 audit[1948]: CRED_DISP pid=1948 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 01:41:20.537417 sshd[1947]: Connection closed by 4.153.228.146 port 38402 Jan 14 01:41:20.537928 sshd-session[1943]: pam_unix(sshd:session): session closed for user core Jan 14 01:41:20.537000 audit[1943]: USER_END pid=1943 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:41:20.537000 audit[1943]: CRED_DISP pid=1943 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:41:20.541869 systemd[1]: sshd@8-10.0.35.206:22-4.153.228.146:38402.service: Deactivated successfully. Jan 14 01:41:20.540000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.35.206:22-4.153.228.146:38402 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:41:20.543840 systemd[1]: session-10.scope: Deactivated successfully. 
Jan 14 01:41:20.544082 systemd[1]: session-10.scope: Consumed 7.597s CPU time, 222M memory peak. Jan 14 01:41:20.545058 systemd-logind[1651]: Session 10 logged out. Waiting for processes to exit. Jan 14 01:41:20.546337 systemd-logind[1651]: Removed session 10. Jan 14 01:41:22.701000 audit[3376]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3376 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:22.701000 audit[3376]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=fffff8106020 a2=0 a3=1 items=0 ppid=3072 pid=3376 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:22.701000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:22.720000 audit[3376]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3376 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:22.720000 audit[3376]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff8106020 a2=0 a3=1 items=0 ppid=3072 pid=3376 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:22.720000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:22.734000 audit[3378]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3378 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:22.734000 audit[3378]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffdfcae310 a2=0 a3=1 items=0 ppid=3072 pid=3378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:22.734000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:22.741000 audit[3378]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3378 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:22.741000 audit[3378]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffdfcae310 a2=0 a3=1 items=0 ppid=3072 pid=3378 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:22.741000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:26.307000 audit[3381]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3381 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:26.310115 kernel: kauditd_printk_skb: 29 callbacks suppressed Jan 14 01:41:26.310188 kernel: audit: type=1325 audit(1768354886.307:536): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3381 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:26.307000 audit[3381]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffffa1ff2a0 a2=0 a3=1 items=0 ppid=3072 pid=3381 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:26.316979 kernel: audit: type=1300 audit(1768354886.307:536): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffffa1ff2a0 a2=0 a3=1 items=0 ppid=3072 pid=3381 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:26.307000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:26.319251 kernel: audit: type=1327 audit(1768354886.307:536): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:26.316000 audit[3381]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3381 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:26.322213 kernel: audit: type=1325 audit(1768354886.316:537): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3381 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:26.322274 kernel: audit: type=1300 audit(1768354886.316:537): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffffa1ff2a0 a2=0 a3=1 items=0 ppid=3072 pid=3381 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:26.316000 audit[3381]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffffa1ff2a0 a2=0 a3=1 items=0 ppid=3072 pid=3381 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:26.316000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:26.329605 kernel: audit: type=1327 audit(1768354886.316:537): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:26.336000 
audit[3383]: NETFILTER_CFG table=filter:111 family=2 entries=18 op=nft_register_rule pid=3383 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:26.336000 audit[3383]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff6ddce30 a2=0 a3=1 items=0 ppid=3072 pid=3383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:26.346077 kernel: audit: type=1325 audit(1768354886.336:538): table=filter:111 family=2 entries=18 op=nft_register_rule pid=3383 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:26.346318 kernel: audit: type=1300 audit(1768354886.336:538): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff6ddce30 a2=0 a3=1 items=0 ppid=3072 pid=3383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:26.336000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:26.348726 kernel: audit: type=1327 audit(1768354886.336:538): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:26.348000 audit[3383]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3383 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:26.348000 audit[3383]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff6ddce30 a2=0 a3=1 items=0 ppid=3072 pid=3383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:26.352900 kernel: audit: 
type=1325 audit(1768354886.348:539): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3383 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:26.348000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:27.359000 audit[3385]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3385 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:27.359000 audit[3385]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffe4952dd0 a2=0 a3=1 items=0 ppid=3072 pid=3385 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:27.359000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:27.368000 audit[3385]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3385 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:27.368000 audit[3385]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe4952dd0 a2=0 a3=1 items=0 ppid=3072 pid=3385 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:27.368000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:28.999000 audit[3387]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3387 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:28.999000 audit[3387]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffc2f58ed0 a2=0 a3=1 items=0 ppid=3072 pid=3387 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:28.999000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:29.009000 audit[3387]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3387 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:29.009000 audit[3387]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc2f58ed0 a2=0 a3=1 items=0 ppid=3072 pid=3387 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:29.009000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:29.026452 systemd[1]: Created slice kubepods-besteffort-podc5e8667e_c313_4435_a217_4843ea2638dd.slice - libcontainer container kubepods-besteffort-podc5e8667e_c313_4435_a217_4843ea2638dd.slice. 
Jan 14 01:41:29.056899 kubelet[2905]: I0114 01:41:29.056854 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/c5e8667e-c313-4435-a217-4843ea2638dd-typha-certs\") pod \"calico-typha-7d48b46ff6-zr4fg\" (UID: \"c5e8667e-c313-4435-a217-4843ea2638dd\") " pod="calico-system/calico-typha-7d48b46ff6-zr4fg" Jan 14 01:41:29.056899 kubelet[2905]: I0114 01:41:29.056903 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsztb\" (UniqueName: \"kubernetes.io/projected/c5e8667e-c313-4435-a217-4843ea2638dd-kube-api-access-zsztb\") pod \"calico-typha-7d48b46ff6-zr4fg\" (UID: \"c5e8667e-c313-4435-a217-4843ea2638dd\") " pod="calico-system/calico-typha-7d48b46ff6-zr4fg" Jan 14 01:41:29.057283 kubelet[2905]: I0114 01:41:29.056927 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5e8667e-c313-4435-a217-4843ea2638dd-tigera-ca-bundle\") pod \"calico-typha-7d48b46ff6-zr4fg\" (UID: \"c5e8667e-c313-4435-a217-4843ea2638dd\") " pod="calico-system/calico-typha-7d48b46ff6-zr4fg" Jan 14 01:41:29.215498 systemd[1]: Created slice kubepods-besteffort-podfb72bc86_858b_44eb_9dc8_ae8e58ac4d3a.slice - libcontainer container kubepods-besteffort-podfb72bc86_858b_44eb_9dc8_ae8e58ac4d3a.slice. 
Jan 14 01:41:29.259014 kubelet[2905]: I0114 01:41:29.258922 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a-var-run-calico\") pod \"calico-node-r5qs2\" (UID: \"fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a\") " pod="calico-system/calico-node-r5qs2" Jan 14 01:41:29.259014 kubelet[2905]: I0114 01:41:29.258994 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a-flexvol-driver-host\") pod \"calico-node-r5qs2\" (UID: \"fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a\") " pod="calico-system/calico-node-r5qs2" Jan 14 01:41:29.259143 kubelet[2905]: I0114 01:41:29.259048 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a-node-certs\") pod \"calico-node-r5qs2\" (UID: \"fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a\") " pod="calico-system/calico-node-r5qs2" Jan 14 01:41:29.259143 kubelet[2905]: I0114 01:41:29.259070 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a-cni-net-dir\") pod \"calico-node-r5qs2\" (UID: \"fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a\") " pod="calico-system/calico-node-r5qs2" Jan 14 01:41:29.259143 kubelet[2905]: I0114 01:41:29.259084 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a-cni-log-dir\") pod \"calico-node-r5qs2\" (UID: \"fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a\") " pod="calico-system/calico-node-r5qs2" Jan 14 01:41:29.259143 kubelet[2905]: I0114 01:41:29.259108 2905 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a-tigera-ca-bundle\") pod \"calico-node-r5qs2\" (UID: \"fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a\") " pod="calico-system/calico-node-r5qs2" Jan 14 01:41:29.259143 kubelet[2905]: I0114 01:41:29.259123 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a-xtables-lock\") pod \"calico-node-r5qs2\" (UID: \"fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a\") " pod="calico-system/calico-node-r5qs2" Jan 14 01:41:29.259247 kubelet[2905]: I0114 01:41:29.259138 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a-var-lib-calico\") pod \"calico-node-r5qs2\" (UID: \"fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a\") " pod="calico-system/calico-node-r5qs2" Jan 14 01:41:29.259247 kubelet[2905]: I0114 01:41:29.259153 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a-cni-bin-dir\") pod \"calico-node-r5qs2\" (UID: \"fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a\") " pod="calico-system/calico-node-r5qs2" Jan 14 01:41:29.259247 kubelet[2905]: I0114 01:41:29.259174 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a-lib-modules\") pod \"calico-node-r5qs2\" (UID: \"fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a\") " pod="calico-system/calico-node-r5qs2" Jan 14 01:41:29.259247 kubelet[2905]: I0114 01:41:29.259197 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a-policysync\") pod \"calico-node-r5qs2\" (UID: \"fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a\") " pod="calico-system/calico-node-r5qs2" Jan 14 01:41:29.259247 kubelet[2905]: I0114 01:41:29.259215 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mslfl\" (UniqueName: \"kubernetes.io/projected/fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a-kube-api-access-mslfl\") pod \"calico-node-r5qs2\" (UID: \"fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a\") " pod="calico-system/calico-node-r5qs2" Jan 14 01:41:29.331691 containerd[1676]: time="2026-01-14T01:41:29.331475815Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7d48b46ff6-zr4fg,Uid:c5e8667e-c313-4435-a217-4843ea2638dd,Namespace:calico-system,Attempt:0,}" Jan 14 01:41:29.358312 kubelet[2905]: E0114 01:41:29.358179 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:41:29.363288 kubelet[2905]: E0114 01:41:29.361840 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.363288 kubelet[2905]: W0114 01:41:29.361866 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.363288 kubelet[2905]: E0114 01:41:29.361898 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.366067 containerd[1676]: time="2026-01-14T01:41:29.366020981Z" level=info msg="connecting to shim 2341be69217ea3d1190c6b275b6cd9546019fea07483145b135f5071288236bc" address="unix:///run/containerd/s/f271ac161b42d24d933f2d26f7dbf0da86de390d1c0d2cf9cfc5cba47a1742e0" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:29.367933 kubelet[2905]: E0114 01:41:29.367902 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.367933 kubelet[2905]: W0114 01:41:29.367927 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.368050 kubelet[2905]: E0114 01:41:29.367947 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.382713 kubelet[2905]: E0114 01:41:29.382631 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.382713 kubelet[2905]: W0114 01:41:29.382656 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.383542 kubelet[2905]: E0114 01:41:29.383487 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.411957 systemd[1]: Started cri-containerd-2341be69217ea3d1190c6b275b6cd9546019fea07483145b135f5071288236bc.scope - libcontainer container 2341be69217ea3d1190c6b275b6cd9546019fea07483145b135f5071288236bc. 
Jan 14 01:41:29.422000 audit: BPF prog-id=156 op=LOAD Jan 14 01:41:29.423000 audit: BPF prog-id=157 op=LOAD Jan 14 01:41:29.423000 audit[3418]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3401 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:29.423000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343162653639323137656133643131393063366232373562366364 Jan 14 01:41:29.423000 audit: BPF prog-id=157 op=UNLOAD Jan 14 01:41:29.423000 audit[3418]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3401 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:29.423000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343162653639323137656133643131393063366232373562366364 Jan 14 01:41:29.423000 audit: BPF prog-id=158 op=LOAD Jan 14 01:41:29.423000 audit[3418]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3401 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:29.423000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343162653639323137656133643131393063366232373562366364 Jan 14 01:41:29.424000 audit: BPF prog-id=159 op=LOAD Jan 14 01:41:29.424000 audit[3418]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3401 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:29.424000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343162653639323137656133643131393063366232373562366364 Jan 14 01:41:29.424000 audit: BPF prog-id=159 op=UNLOAD Jan 14 01:41:29.424000 audit[3418]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3401 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:29.424000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343162653639323137656133643131393063366232373562366364 Jan 14 01:41:29.424000 audit: BPF prog-id=158 op=UNLOAD Jan 14 01:41:29.424000 audit[3418]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3401 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
14 01:41:29.424000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343162653639323137656133643131393063366232373562366364 Jan 14 01:41:29.424000 audit: BPF prog-id=160 op=LOAD Jan 14 01:41:29.424000 audit[3418]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3401 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:29.424000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343162653639323137656133643131393063366232373562366364 Jan 14 01:41:29.447673 containerd[1676]: time="2026-01-14T01:41:29.447634186Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7d48b46ff6-zr4fg,Uid:c5e8667e-c313-4435-a217-4843ea2638dd,Namespace:calico-system,Attempt:0,} returns sandbox id \"2341be69217ea3d1190c6b275b6cd9546019fea07483145b135f5071288236bc\"" Jan 14 01:41:29.449301 containerd[1676]: time="2026-01-14T01:41:29.449275310Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 14 01:41:29.451743 kubelet[2905]: E0114 01:41:29.451706 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.451743 kubelet[2905]: W0114 01:41:29.451741 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.451922 kubelet[2905]: E0114 01:41:29.451759 2905 plugins.go:703] 
"Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.451922 kubelet[2905]: E0114 01:41:29.452095 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.451922 kubelet[2905]: W0114 01:41:29.452106 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.451922 kubelet[2905]: E0114 01:41:29.452145 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.452394 kubelet[2905]: E0114 01:41:29.452292 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.452394 kubelet[2905]: W0114 01:41:29.452300 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.452394 kubelet[2905]: E0114 01:41:29.452309 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.452450 kubelet[2905]: E0114 01:41:29.452437 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.452450 kubelet[2905]: W0114 01:41:29.452445 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.452490 kubelet[2905]: E0114 01:41:29.452453 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.452617 kubelet[2905]: E0114 01:41:29.452605 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.452655 kubelet[2905]: W0114 01:41:29.452631 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.452655 kubelet[2905]: E0114 01:41:29.452641 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.452838 kubelet[2905]: E0114 01:41:29.452827 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.452838 kubelet[2905]: W0114 01:41:29.452839 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.452907 kubelet[2905]: E0114 01:41:29.452848 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.453039 kubelet[2905]: E0114 01:41:29.453028 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.453070 kubelet[2905]: W0114 01:41:29.453039 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.453070 kubelet[2905]: E0114 01:41:29.453048 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.453203 kubelet[2905]: E0114 01:41:29.453193 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.453203 kubelet[2905]: W0114 01:41:29.453203 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.453259 kubelet[2905]: E0114 01:41:29.453211 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.453357 kubelet[2905]: E0114 01:41:29.453346 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.453388 kubelet[2905]: W0114 01:41:29.453369 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.453388 kubelet[2905]: E0114 01:41:29.453378 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.454041 kubelet[2905]: E0114 01:41:29.453981 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.454041 kubelet[2905]: W0114 01:41:29.454001 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.454145 kubelet[2905]: E0114 01:41:29.454101 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.454333 kubelet[2905]: E0114 01:41:29.454305 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.454333 kubelet[2905]: W0114 01:41:29.454320 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.454333 kubelet[2905]: E0114 01:41:29.454329 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.454488 kubelet[2905]: E0114 01:41:29.454477 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.454524 kubelet[2905]: W0114 01:41:29.454489 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.454524 kubelet[2905]: E0114 01:41:29.454501 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.454685 kubelet[2905]: E0114 01:41:29.454674 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.454685 kubelet[2905]: W0114 01:41:29.454685 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.454763 kubelet[2905]: E0114 01:41:29.454693 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.454875 kubelet[2905]: E0114 01:41:29.454863 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.454875 kubelet[2905]: W0114 01:41:29.454874 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.454980 kubelet[2905]: E0114 01:41:29.454882 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.455008 kubelet[2905]: E0114 01:41:29.455004 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.455028 kubelet[2905]: W0114 01:41:29.455010 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.455028 kubelet[2905]: E0114 01:41:29.455017 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.455154 kubelet[2905]: E0114 01:41:29.455142 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.455154 kubelet[2905]: W0114 01:41:29.455153 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.455210 kubelet[2905]: E0114 01:41:29.455164 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.455302 kubelet[2905]: E0114 01:41:29.455292 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.455302 kubelet[2905]: W0114 01:41:29.455302 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.455355 kubelet[2905]: E0114 01:41:29.455309 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.455430 kubelet[2905]: E0114 01:41:29.455420 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.455430 kubelet[2905]: W0114 01:41:29.455430 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.455490 kubelet[2905]: E0114 01:41:29.455437 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.455557 kubelet[2905]: E0114 01:41:29.455547 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.455557 kubelet[2905]: W0114 01:41:29.455556 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.455606 kubelet[2905]: E0114 01:41:29.455563 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.455684 kubelet[2905]: E0114 01:41:29.455673 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.455684 kubelet[2905]: W0114 01:41:29.455683 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.455754 kubelet[2905]: E0114 01:41:29.455690 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.460303 kubelet[2905]: E0114 01:41:29.460193 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.460303 kubelet[2905]: W0114 01:41:29.460229 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.460303 kubelet[2905]: E0114 01:41:29.460260 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.460454 kubelet[2905]: I0114 01:41:29.460438 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b8220b4-811d-4471-95d7-cea88df93438-kubelet-dir\") pod \"csi-node-driver-twlzn\" (UID: \"1b8220b4-811d-4471-95d7-cea88df93438\") " pod="calico-system/csi-node-driver-twlzn" Jan 14 01:41:29.460701 kubelet[2905]: E0114 01:41:29.460663 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.460778 kubelet[2905]: W0114 01:41:29.460765 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.460846 kubelet[2905]: E0114 01:41:29.460834 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.461099 kubelet[2905]: E0114 01:41:29.461084 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.461344 kubelet[2905]: W0114 01:41:29.461161 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.461344 kubelet[2905]: E0114 01:41:29.461177 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.461679 kubelet[2905]: E0114 01:41:29.461663 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.461771 kubelet[2905]: W0114 01:41:29.461759 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.461844 kubelet[2905]: E0114 01:41:29.461825 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.461913 kubelet[2905]: I0114 01:41:29.461901 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1b8220b4-811d-4471-95d7-cea88df93438-socket-dir\") pod \"csi-node-driver-twlzn\" (UID: \"1b8220b4-811d-4471-95d7-cea88df93438\") " pod="calico-system/csi-node-driver-twlzn" Jan 14 01:41:29.462138 kubelet[2905]: E0114 01:41:29.462123 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.462217 kubelet[2905]: W0114 01:41:29.462203 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.462282 kubelet[2905]: E0114 01:41:29.462261 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.462594 kubelet[2905]: E0114 01:41:29.462485 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.462594 kubelet[2905]: W0114 01:41:29.462498 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.462594 kubelet[2905]: E0114 01:41:29.462508 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.462751 kubelet[2905]: E0114 01:41:29.462738 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.462822 kubelet[2905]: W0114 01:41:29.462809 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.463011 kubelet[2905]: E0114 01:41:29.462867 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.463011 kubelet[2905]: I0114 01:41:29.462900 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz2bm\" (UniqueName: \"kubernetes.io/projected/1b8220b4-811d-4471-95d7-cea88df93438-kube-api-access-bz2bm\") pod \"csi-node-driver-twlzn\" (UID: \"1b8220b4-811d-4471-95d7-cea88df93438\") " pod="calico-system/csi-node-driver-twlzn" Jan 14 01:41:29.463141 kubelet[2905]: E0114 01:41:29.463127 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.463197 kubelet[2905]: W0114 01:41:29.463186 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.463244 kubelet[2905]: E0114 01:41:29.463234 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.463318 kubelet[2905]: I0114 01:41:29.463306 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1b8220b4-811d-4471-95d7-cea88df93438-registration-dir\") pod \"csi-node-driver-twlzn\" (UID: \"1b8220b4-811d-4471-95d7-cea88df93438\") " pod="calico-system/csi-node-driver-twlzn" Jan 14 01:41:29.463550 kubelet[2905]: E0114 01:41:29.463535 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.463612 kubelet[2905]: W0114 01:41:29.463600 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.463670 kubelet[2905]: E0114 01:41:29.463659 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.464054 kubelet[2905]: E0114 01:41:29.463936 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.464054 kubelet[2905]: W0114 01:41:29.463950 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.464054 kubelet[2905]: E0114 01:41:29.463960 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.464228 kubelet[2905]: E0114 01:41:29.464215 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.464282 kubelet[2905]: W0114 01:41:29.464271 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.464336 kubelet[2905]: E0114 01:41:29.464325 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.464555 kubelet[2905]: E0114 01:41:29.464542 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.464636 kubelet[2905]: W0114 01:41:29.464608 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.464636 kubelet[2905]: E0114 01:41:29.464623 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.464949 kubelet[2905]: E0114 01:41:29.464911 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.464949 kubelet[2905]: W0114 01:41:29.464925 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.464949 kubelet[2905]: E0114 01:41:29.464934 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.465117 kubelet[2905]: I0114 01:41:29.465065 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/1b8220b4-811d-4471-95d7-cea88df93438-varrun\") pod \"csi-node-driver-twlzn\" (UID: \"1b8220b4-811d-4471-95d7-cea88df93438\") " pod="calico-system/csi-node-driver-twlzn" Jan 14 01:41:29.465408 kubelet[2905]: E0114 01:41:29.465314 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.465408 kubelet[2905]: W0114 01:41:29.465328 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.465408 kubelet[2905]: E0114 01:41:29.465338 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.465568 kubelet[2905]: E0114 01:41:29.465556 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.465618 kubelet[2905]: W0114 01:41:29.465608 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.465674 kubelet[2905]: E0114 01:41:29.465659 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.518609 containerd[1676]: time="2026-01-14T01:41:29.518512564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-r5qs2,Uid:fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a,Namespace:calico-system,Attempt:0,}" Jan 14 01:41:29.548637 containerd[1676]: time="2026-01-14T01:41:29.548578559Z" level=info msg="connecting to shim 994c2c014c1c08403335b6ee806ffeba314de50a59edb63eb1475fdbe621bfd1" address="unix:///run/containerd/s/a9194509a4f24f78b4c5c9cb0646a82440dc2a2cea687db7dffd5b7d013ea7d5" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:29.565772 kubelet[2905]: E0114 01:41:29.565736 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.565772 kubelet[2905]: W0114 01:41:29.565758 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.565772 kubelet[2905]: E0114 01:41:29.565775 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.566130 kubelet[2905]: E0114 01:41:29.566024 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.566130 kubelet[2905]: W0114 01:41:29.566032 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.566130 kubelet[2905]: E0114 01:41:29.566040 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.566269 kubelet[2905]: E0114 01:41:29.566251 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.566269 kubelet[2905]: W0114 01:41:29.566264 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.566324 kubelet[2905]: E0114 01:41:29.566273 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.566468 kubelet[2905]: E0114 01:41:29.566452 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.566468 kubelet[2905]: W0114 01:41:29.566465 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.566528 kubelet[2905]: E0114 01:41:29.566475 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.566687 kubelet[2905]: E0114 01:41:29.566672 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.566687 kubelet[2905]: W0114 01:41:29.566685 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.566779 kubelet[2905]: E0114 01:41:29.566693 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.567585 kubelet[2905]: E0114 01:41:29.567236 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.567585 kubelet[2905]: W0114 01:41:29.567260 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.567585 kubelet[2905]: E0114 01:41:29.567285 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.569977 systemd[1]: Started cri-containerd-994c2c014c1c08403335b6ee806ffeba314de50a59edb63eb1475fdbe621bfd1.scope - libcontainer container 994c2c014c1c08403335b6ee806ffeba314de50a59edb63eb1475fdbe621bfd1. Jan 14 01:41:29.570370 kubelet[2905]: E0114 01:41:29.570338 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.570370 kubelet[2905]: W0114 01:41:29.570355 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.570370 kubelet[2905]: E0114 01:41:29.570370 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.570775 kubelet[2905]: E0114 01:41:29.570753 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.570867 kubelet[2905]: W0114 01:41:29.570852 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.571024 kubelet[2905]: E0114 01:41:29.570911 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.571886 kubelet[2905]: E0114 01:41:29.571868 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.571965 kubelet[2905]: W0114 01:41:29.571950 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.572164 kubelet[2905]: E0114 01:41:29.572147 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.572821 kubelet[2905]: E0114 01:41:29.572803 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.573026 kubelet[2905]: W0114 01:41:29.572883 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.573026 kubelet[2905]: E0114 01:41:29.572902 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.573316 kubelet[2905]: E0114 01:41:29.573293 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.573406 kubelet[2905]: W0114 01:41:29.573392 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.573478 kubelet[2905]: E0114 01:41:29.573466 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.573738 kubelet[2905]: E0114 01:41:29.573714 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.573813 kubelet[2905]: W0114 01:41:29.573802 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.573875 kubelet[2905]: E0114 01:41:29.573855 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.574236 kubelet[2905]: E0114 01:41:29.574190 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.574360 kubelet[2905]: W0114 01:41:29.574324 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.574450 kubelet[2905]: E0114 01:41:29.574438 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.574734 kubelet[2905]: E0114 01:41:29.574703 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.574810 kubelet[2905]: W0114 01:41:29.574790 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.574964 kubelet[2905]: E0114 01:41:29.574857 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.575086 kubelet[2905]: E0114 01:41:29.575073 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.575191 kubelet[2905]: W0114 01:41:29.575156 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.575340 kubelet[2905]: E0114 01:41:29.575243 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.575455 kubelet[2905]: E0114 01:41:29.575442 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.575674 kubelet[2905]: W0114 01:41:29.575490 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.575674 kubelet[2905]: E0114 01:41:29.575502 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.575852 kubelet[2905]: E0114 01:41:29.575823 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.576015 kubelet[2905]: W0114 01:41:29.575913 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.576015 kubelet[2905]: E0114 01:41:29.575947 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.576274 kubelet[2905]: E0114 01:41:29.576260 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.576344 kubelet[2905]: W0114 01:41:29.576333 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.576408 kubelet[2905]: E0114 01:41:29.576384 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.576664 kubelet[2905]: E0114 01:41:29.576640 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.576775 kubelet[2905]: W0114 01:41:29.576761 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.576963 kubelet[2905]: E0114 01:41:29.576815 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.577072 kubelet[2905]: E0114 01:41:29.577058 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.577125 kubelet[2905]: W0114 01:41:29.577115 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.577173 kubelet[2905]: E0114 01:41:29.577164 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.577384 kubelet[2905]: E0114 01:41:29.577371 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.577602 kubelet[2905]: W0114 01:41:29.577466 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.577602 kubelet[2905]: E0114 01:41:29.577485 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.577761 kubelet[2905]: E0114 01:41:29.577746 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.577825 kubelet[2905]: W0114 01:41:29.577814 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.577875 kubelet[2905]: E0114 01:41:29.577865 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.578236 kubelet[2905]: E0114 01:41:29.578091 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.578236 kubelet[2905]: W0114 01:41:29.578103 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.578236 kubelet[2905]: E0114 01:41:29.578113 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.578592 kubelet[2905]: E0114 01:41:29.578442 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.578592 kubelet[2905]: W0114 01:41:29.578456 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.578592 kubelet[2905]: E0114 01:41:29.578467 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:29.578795 kubelet[2905]: E0114 01:41:29.578774 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.578866 kubelet[2905]: W0114 01:41:29.578853 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.579202 kubelet[2905]: E0114 01:41:29.578941 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.579000 audit: BPF prog-id=161 op=LOAD Jan 14 01:41:29.579000 audit: BPF prog-id=162 op=LOAD Jan 14 01:41:29.579000 audit[3503]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3491 pid=3503 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:29.579000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939346332633031346331633038343033333335623665653830366666 Jan 14 01:41:29.579000 audit: BPF prog-id=162 op=UNLOAD Jan 14 01:41:29.579000 audit[3503]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3503 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:29.579000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939346332633031346331633038343033333335623665653830366666 Jan 14 01:41:29.579000 audit: BPF prog-id=163 op=LOAD Jan 14 01:41:29.579000 audit[3503]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3491 pid=3503 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:29.579000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939346332633031346331633038343033333335623665653830366666 Jan 14 01:41:29.579000 audit: BPF prog-id=164 op=LOAD Jan 14 01:41:29.579000 audit[3503]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3491 pid=3503 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:29.579000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939346332633031346331633038343033333335623665653830366666 Jan 14 01:41:29.579000 audit: BPF prog-id=164 op=UNLOAD Jan 14 01:41:29.579000 audit[3503]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3503 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:29.579000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939346332633031346331633038343033333335623665653830366666 Jan 14 01:41:29.579000 audit: BPF prog-id=163 op=UNLOAD Jan 14 01:41:29.579000 audit[3503]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3503 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
14 01:41:29.579000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939346332633031346331633038343033333335623665653830366666 Jan 14 01:41:29.579000 audit: BPF prog-id=165 op=LOAD Jan 14 01:41:29.579000 audit[3503]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3491 pid=3503 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:29.579000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939346332633031346331633038343033333335623665653830366666 Jan 14 01:41:29.591588 kubelet[2905]: E0114 01:41:29.591509 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:29.591588 kubelet[2905]: W0114 01:41:29.591531 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:29.591588 kubelet[2905]: E0114 01:41:29.591548 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:29.596933 containerd[1676]: time="2026-01-14T01:41:29.596896480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-r5qs2,Uid:fb72bc86-858b-44eb-9dc8-ae8e58ac4d3a,Namespace:calico-system,Attempt:0,} returns sandbox id \"994c2c014c1c08403335b6ee806ffeba314de50a59edb63eb1475fdbe621bfd1\"" Jan 14 01:41:30.032000 audit[3555]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=3555 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:30.032000 audit[3555]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=fffff0bca300 a2=0 a3=1 items=0 ppid=3072 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:30.032000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:30.043000 audit[3555]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=3555 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:30.043000 audit[3555]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff0bca300 a2=0 a3=1 items=0 ppid=3072 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:30.043000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:30.903998 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2508278129.mount: Deactivated successfully. 
Jan 14 01:41:31.229679 kubelet[2905]: E0114 01:41:31.229526 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:41:31.793640 containerd[1676]: time="2026-01-14T01:41:31.793579591Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:31.795387 containerd[1676]: time="2026-01-14T01:41:31.795338395Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31716861" Jan 14 01:41:31.796439 containerd[1676]: time="2026-01-14T01:41:31.796410838Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:31.799032 containerd[1676]: time="2026-01-14T01:41:31.798999244Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:31.799526 containerd[1676]: time="2026-01-14T01:41:31.799500646Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.350088936s" Jan 14 01:41:31.799758 containerd[1676]: time="2026-01-14T01:41:31.799531926Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference 
\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Jan 14 01:41:31.801100 containerd[1676]: time="2026-01-14T01:41:31.801061890Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 14 01:41:31.818319 containerd[1676]: time="2026-01-14T01:41:31.818123052Z" level=info msg="CreateContainer within sandbox \"2341be69217ea3d1190c6b275b6cd9546019fea07483145b135f5071288236bc\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 14 01:41:31.826579 containerd[1676]: time="2026-01-14T01:41:31.826345993Z" level=info msg="Container a29648cc31a1527f2a839a9ebe1607325d765b111cfe61fb73704bc6cfa98ba0: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:41:31.837350 containerd[1676]: time="2026-01-14T01:41:31.837201180Z" level=info msg="CreateContainer within sandbox \"2341be69217ea3d1190c6b275b6cd9546019fea07483145b135f5071288236bc\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a29648cc31a1527f2a839a9ebe1607325d765b111cfe61fb73704bc6cfa98ba0\"" Jan 14 01:41:31.838150 containerd[1676]: time="2026-01-14T01:41:31.837903262Z" level=info msg="StartContainer for \"a29648cc31a1527f2a839a9ebe1607325d765b111cfe61fb73704bc6cfa98ba0\"" Jan 14 01:41:31.839764 containerd[1676]: time="2026-01-14T01:41:31.839522586Z" level=info msg="connecting to shim a29648cc31a1527f2a839a9ebe1607325d765b111cfe61fb73704bc6cfa98ba0" address="unix:///run/containerd/s/f271ac161b42d24d933f2d26f7dbf0da86de390d1c0d2cf9cfc5cba47a1742e0" protocol=ttrpc version=3 Jan 14 01:41:31.860189 systemd[1]: Started cri-containerd-a29648cc31a1527f2a839a9ebe1607325d765b111cfe61fb73704bc6cfa98ba0.scope - libcontainer container a29648cc31a1527f2a839a9ebe1607325d765b111cfe61fb73704bc6cfa98ba0. 
Jan 14 01:41:31.872000 audit: BPF prog-id=166 op=LOAD Jan 14 01:41:31.874840 kernel: kauditd_printk_skb: 64 callbacks suppressed Jan 14 01:41:31.874898 kernel: audit: type=1334 audit(1768354891.872:562): prog-id=166 op=LOAD Jan 14 01:41:31.872000 audit: BPF prog-id=167 op=LOAD Jan 14 01:41:31.876754 kernel: audit: type=1334 audit(1768354891.872:563): prog-id=167 op=LOAD Jan 14 01:41:31.876794 kernel: audit: type=1300 audit(1768354891.872:563): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3401 pid=3567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:31.872000 audit[3567]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3401 pid=3567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:31.872000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132393634386363333161313532376632613833396139656265313630 Jan 14 01:41:31.884673 kernel: audit: type=1327 audit(1768354891.872:563): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132393634386363333161313532376632613833396139656265313630 Jan 14 01:41:31.884774 kernel: audit: type=1334 audit(1768354891.873:564): prog-id=167 op=UNLOAD Jan 14 01:41:31.873000 audit: BPF prog-id=167 op=UNLOAD Jan 14 01:41:31.873000 audit[3567]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3401 pid=3567 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:31.889566 kernel: audit: type=1300 audit(1768354891.873:564): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3401 pid=3567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:31.889658 kernel: audit: type=1327 audit(1768354891.873:564): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132393634386363333161313532376632613833396139656265313630 Jan 14 01:41:31.873000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132393634386363333161313532376632613833396139656265313630 Jan 14 01:41:31.873000 audit: BPF prog-id=168 op=LOAD Jan 14 01:41:31.894089 kernel: audit: type=1334 audit(1768354891.873:565): prog-id=168 op=LOAD Jan 14 01:41:31.894139 kernel: audit: type=1300 audit(1768354891.873:565): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3401 pid=3567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:31.873000 audit[3567]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3401 pid=3567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
14 01:41:31.873000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132393634386363333161313532376632613833396139656265313630 Jan 14 01:41:31.900579 kernel: audit: type=1327 audit(1768354891.873:565): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132393634386363333161313532376632613833396139656265313630 Jan 14 01:41:31.874000 audit: BPF prog-id=169 op=LOAD Jan 14 01:41:31.874000 audit[3567]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3401 pid=3567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:31.874000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132393634386363333161313532376632613833396139656265313630 Jan 14 01:41:31.879000 audit: BPF prog-id=169 op=UNLOAD Jan 14 01:41:31.879000 audit[3567]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3401 pid=3567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:31.879000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132393634386363333161313532376632613833396139656265313630 Jan 14 01:41:31.879000 audit: BPF prog-id=168 op=UNLOAD Jan 14 01:41:31.879000 audit[3567]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3401 pid=3567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:31.879000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132393634386363333161313532376632613833396139656265313630 Jan 14 01:41:31.879000 audit: BPF prog-id=170 op=LOAD Jan 14 01:41:31.879000 audit[3567]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3401 pid=3567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:31.879000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6132393634386363333161313532376632613833396139656265313630 Jan 14 01:41:31.917948 containerd[1676]: time="2026-01-14T01:41:31.917911703Z" level=info msg="StartContainer for \"a29648cc31a1527f2a839a9ebe1607325d765b111cfe61fb73704bc6cfa98ba0\" returns successfully" Jan 14 01:41:32.376444 kubelet[2905]: E0114 01:41:32.376404 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of 
JSON input Jan 14 01:41:32.376444 kubelet[2905]: W0114 01:41:32.376429 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.376444 kubelet[2905]: E0114 01:41:32.376451 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.376862 kubelet[2905]: E0114 01:41:32.376605 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.376862 kubelet[2905]: W0114 01:41:32.376614 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.376862 kubelet[2905]: E0114 01:41:32.376658 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.376862 kubelet[2905]: E0114 01:41:32.376818 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.376862 kubelet[2905]: W0114 01:41:32.376826 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.376862 kubelet[2905]: E0114 01:41:32.376834 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.376998 kubelet[2905]: E0114 01:41:32.376975 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.376998 kubelet[2905]: W0114 01:41:32.376988 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.376998 kubelet[2905]: E0114 01:41:32.376997 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.377190 kubelet[2905]: E0114 01:41:32.377156 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.377190 kubelet[2905]: W0114 01:41:32.377170 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.377190 kubelet[2905]: E0114 01:41:32.377179 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.377349 kubelet[2905]: E0114 01:41:32.377326 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.377349 kubelet[2905]: W0114 01:41:32.377338 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.377349 kubelet[2905]: E0114 01:41:32.377346 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.377491 kubelet[2905]: E0114 01:41:32.377479 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.377491 kubelet[2905]: W0114 01:41:32.377490 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.377535 kubelet[2905]: E0114 01:41:32.377498 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.377660 kubelet[2905]: E0114 01:41:32.377637 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.377660 kubelet[2905]: W0114 01:41:32.377653 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.377702 kubelet[2905]: E0114 01:41:32.377662 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.377833 kubelet[2905]: E0114 01:41:32.377820 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.377833 kubelet[2905]: W0114 01:41:32.377831 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.377883 kubelet[2905]: E0114 01:41:32.377840 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.377977 kubelet[2905]: E0114 01:41:32.377965 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.377977 kubelet[2905]: W0114 01:41:32.377975 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.378015 kubelet[2905]: E0114 01:41:32.377983 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.378120 kubelet[2905]: E0114 01:41:32.378109 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.378141 kubelet[2905]: W0114 01:41:32.378120 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.378141 kubelet[2905]: E0114 01:41:32.378128 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.378263 kubelet[2905]: E0114 01:41:32.378253 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.378284 kubelet[2905]: W0114 01:41:32.378264 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.378284 kubelet[2905]: E0114 01:41:32.378272 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.378434 kubelet[2905]: E0114 01:41:32.378423 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.378457 kubelet[2905]: W0114 01:41:32.378434 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.378457 kubelet[2905]: E0114 01:41:32.378442 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.378593 kubelet[2905]: E0114 01:41:32.378575 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.378593 kubelet[2905]: W0114 01:41:32.378586 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.378593 kubelet[2905]: E0114 01:41:32.378593 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.378752 kubelet[2905]: E0114 01:41:32.378740 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.378752 kubelet[2905]: W0114 01:41:32.378750 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.378811 kubelet[2905]: E0114 01:41:32.378758 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.387409 kubelet[2905]: E0114 01:41:32.387380 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.387409 kubelet[2905]: W0114 01:41:32.387399 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.387409 kubelet[2905]: E0114 01:41:32.387412 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.387922 kubelet[2905]: E0114 01:41:32.387652 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.387922 kubelet[2905]: W0114 01:41:32.387664 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.387922 kubelet[2905]: E0114 01:41:32.387674 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.388023 kubelet[2905]: E0114 01:41:32.388000 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.388174 kubelet[2905]: W0114 01:41:32.388011 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.388395 kubelet[2905]: E0114 01:41:32.388139 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.388574 kubelet[2905]: E0114 01:41:32.388548 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.388574 kubelet[2905]: W0114 01:41:32.388563 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.388574 kubelet[2905]: E0114 01:41:32.388574 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.388758 kubelet[2905]: E0114 01:41:32.388735 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.388758 kubelet[2905]: W0114 01:41:32.388747 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.388758 kubelet[2905]: E0114 01:41:32.388756 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.388935 kubelet[2905]: E0114 01:41:32.388923 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.388935 kubelet[2905]: W0114 01:41:32.388933 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.388981 kubelet[2905]: E0114 01:41:32.388941 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.389105 kubelet[2905]: E0114 01:41:32.389093 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.389105 kubelet[2905]: W0114 01:41:32.389104 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.389154 kubelet[2905]: E0114 01:41:32.389112 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.389306 kubelet[2905]: E0114 01:41:32.389292 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.389306 kubelet[2905]: W0114 01:41:32.389303 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.389369 kubelet[2905]: E0114 01:41:32.389311 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.389528 kubelet[2905]: E0114 01:41:32.389508 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.389528 kubelet[2905]: W0114 01:41:32.389527 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.389606 kubelet[2905]: E0114 01:41:32.389540 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.389674 kubelet[2905]: E0114 01:41:32.389662 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.389674 kubelet[2905]: W0114 01:41:32.389672 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.389730 kubelet[2905]: E0114 01:41:32.389679 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.389859 kubelet[2905]: E0114 01:41:32.389847 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.389859 kubelet[2905]: W0114 01:41:32.389858 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.389921 kubelet[2905]: E0114 01:41:32.389869 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.390142 kubelet[2905]: E0114 01:41:32.390127 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.390202 kubelet[2905]: W0114 01:41:32.390190 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.390257 kubelet[2905]: E0114 01:41:32.390246 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.390496 kubelet[2905]: E0114 01:41:32.390481 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.390559 kubelet[2905]: W0114 01:41:32.390549 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.390619 kubelet[2905]: E0114 01:41:32.390608 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.390924 kubelet[2905]: E0114 01:41:32.390822 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.390924 kubelet[2905]: W0114 01:41:32.390834 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.390924 kubelet[2905]: E0114 01:41:32.390845 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.391077 kubelet[2905]: E0114 01:41:32.391063 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.391129 kubelet[2905]: W0114 01:41:32.391119 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.391187 kubelet[2905]: E0114 01:41:32.391177 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.391448 kubelet[2905]: E0114 01:41:32.391434 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.391518 kubelet[2905]: W0114 01:41:32.391507 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.391570 kubelet[2905]: E0114 01:41:32.391561 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:32.391804 kubelet[2905]: E0114 01:41:32.391790 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.391903 kubelet[2905]: W0114 01:41:32.391889 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.391965 kubelet[2905]: E0114 01:41:32.391953 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:32.392221 kubelet[2905]: E0114 01:41:32.392178 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:32.392221 kubelet[2905]: W0114 01:41:32.392190 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:32.392221 kubelet[2905]: E0114 01:41:32.392199 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.229142 kubelet[2905]: E0114 01:41:33.229018 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:41:33.306023 kubelet[2905]: I0114 01:41:33.305973 2905 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 14 01:41:33.310852 containerd[1676]: time="2026-01-14T01:41:33.310806957Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:33.312304 containerd[1676]: time="2026-01-14T01:41:33.312258881Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 14 01:41:33.313456 containerd[1676]: time="2026-01-14T01:41:33.313407003Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:33.316291 containerd[1676]: time="2026-01-14T01:41:33.316242531Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:33.316849 containerd[1676]: time="2026-01-14T01:41:33.316816492Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size 
\"5636392\" in 1.515709002s" Jan 14 01:41:33.316882 containerd[1676]: time="2026-01-14T01:41:33.316855772Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Jan 14 01:41:33.320794 containerd[1676]: time="2026-01-14T01:41:33.320759822Z" level=info msg="CreateContainer within sandbox \"994c2c014c1c08403335b6ee806ffeba314de50a59edb63eb1475fdbe621bfd1\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 14 01:41:33.333928 containerd[1676]: time="2026-01-14T01:41:33.333885855Z" level=info msg="Container 6c16eaf05e2f32f0c6774f970e6f3cddfbd39f4292406bf4ceb13e7d358d7d5a: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:41:33.345045 containerd[1676]: time="2026-01-14T01:41:33.345005083Z" level=info msg="CreateContainer within sandbox \"994c2c014c1c08403335b6ee806ffeba314de50a59edb63eb1475fdbe621bfd1\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"6c16eaf05e2f32f0c6774f970e6f3cddfbd39f4292406bf4ceb13e7d358d7d5a\"" Jan 14 01:41:33.346060 containerd[1676]: time="2026-01-14T01:41:33.345944645Z" level=info msg="StartContainer for \"6c16eaf05e2f32f0c6774f970e6f3cddfbd39f4292406bf4ceb13e7d358d7d5a\"" Jan 14 01:41:33.348513 containerd[1676]: time="2026-01-14T01:41:33.348358451Z" level=info msg="connecting to shim 6c16eaf05e2f32f0c6774f970e6f3cddfbd39f4292406bf4ceb13e7d358d7d5a" address="unix:///run/containerd/s/a9194509a4f24f78b4c5c9cb0646a82440dc2a2cea687db7dffd5b7d013ea7d5" protocol=ttrpc version=3 Jan 14 01:41:33.376093 systemd[1]: Started cri-containerd-6c16eaf05e2f32f0c6774f970e6f3cddfbd39f4292406bf4ceb13e7d358d7d5a.scope - libcontainer container 6c16eaf05e2f32f0c6774f970e6f3cddfbd39f4292406bf4ceb13e7d358d7d5a. 
Jan 14 01:41:33.384425 kubelet[2905]: E0114 01:41:33.384382 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.384425 kubelet[2905]: W0114 01:41:33.384418 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.384941 kubelet[2905]: E0114 01:41:33.384442 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.385077 kubelet[2905]: E0114 01:41:33.385047 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.385077 kubelet[2905]: W0114 01:41:33.385065 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.385142 kubelet[2905]: E0114 01:41:33.385078 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.385259 kubelet[2905]: E0114 01:41:33.385246 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.385259 kubelet[2905]: W0114 01:41:33.385257 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.385312 kubelet[2905]: E0114 01:41:33.385266 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.385789 kubelet[2905]: E0114 01:41:33.385665 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.385843 kubelet[2905]: W0114 01:41:33.385801 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.385843 kubelet[2905]: E0114 01:41:33.385817 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.386229 kubelet[2905]: E0114 01:41:33.386158 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.386272 kubelet[2905]: W0114 01:41:33.386228 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.386272 kubelet[2905]: E0114 01:41:33.386245 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.386682 kubelet[2905]: E0114 01:41:33.386619 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.386682 kubelet[2905]: W0114 01:41:33.386685 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.386776 kubelet[2905]: E0114 01:41:33.386697 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.387198 kubelet[2905]: E0114 01:41:33.387153 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.387546 kubelet[2905]: W0114 01:41:33.387527 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.387592 kubelet[2905]: E0114 01:41:33.387550 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.388225 kubelet[2905]: E0114 01:41:33.388204 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.388225 kubelet[2905]: W0114 01:41:33.388225 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.388297 kubelet[2905]: E0114 01:41:33.388236 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.388425 kubelet[2905]: E0114 01:41:33.388413 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.388425 kubelet[2905]: W0114 01:41:33.388424 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.388474 kubelet[2905]: E0114 01:41:33.388432 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.388561 kubelet[2905]: E0114 01:41:33.388551 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.388561 kubelet[2905]: W0114 01:41:33.388560 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.388612 kubelet[2905]: E0114 01:41:33.388568 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.388695 kubelet[2905]: E0114 01:41:33.388685 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.388866 kubelet[2905]: W0114 01:41:33.388694 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.388866 kubelet[2905]: E0114 01:41:33.388701 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.388866 kubelet[2905]: E0114 01:41:33.388867 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.388956 kubelet[2905]: W0114 01:41:33.388874 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.388956 kubelet[2905]: E0114 01:41:33.388882 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.389033 kubelet[2905]: E0114 01:41:33.389012 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.389033 kubelet[2905]: W0114 01:41:33.389022 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.389033 kubelet[2905]: E0114 01:41:33.389029 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.389162 kubelet[2905]: E0114 01:41:33.389151 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.389162 kubelet[2905]: W0114 01:41:33.389161 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.389208 kubelet[2905]: E0114 01:41:33.389168 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.389291 kubelet[2905]: E0114 01:41:33.389281 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.389315 kubelet[2905]: W0114 01:41:33.389290 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.389315 kubelet[2905]: E0114 01:41:33.389297 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.395377 kubelet[2905]: E0114 01:41:33.395334 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.395448 kubelet[2905]: W0114 01:41:33.395373 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.395448 kubelet[2905]: E0114 01:41:33.395409 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.395691 kubelet[2905]: E0114 01:41:33.395666 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.395691 kubelet[2905]: W0114 01:41:33.395677 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.395691 kubelet[2905]: E0114 01:41:33.395685 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.395949 kubelet[2905]: E0114 01:41:33.395928 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.395949 kubelet[2905]: W0114 01:41:33.395945 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.396024 kubelet[2905]: E0114 01:41:33.395957 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.396121 kubelet[2905]: E0114 01:41:33.396108 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.396121 kubelet[2905]: W0114 01:41:33.396120 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.396195 kubelet[2905]: E0114 01:41:33.396129 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.396277 kubelet[2905]: E0114 01:41:33.396265 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.396277 kubelet[2905]: W0114 01:41:33.396275 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.396327 kubelet[2905]: E0114 01:41:33.396283 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.396443 kubelet[2905]: E0114 01:41:33.396433 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.396471 kubelet[2905]: W0114 01:41:33.396443 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.396471 kubelet[2905]: E0114 01:41:33.396451 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.396674 kubelet[2905]: E0114 01:41:33.396658 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.396674 kubelet[2905]: W0114 01:41:33.396672 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.396733 kubelet[2905]: E0114 01:41:33.396683 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.396885 kubelet[2905]: E0114 01:41:33.396871 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.396885 kubelet[2905]: W0114 01:41:33.396883 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.396990 kubelet[2905]: E0114 01:41:33.396891 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.397041 kubelet[2905]: E0114 01:41:33.397028 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.397041 kubelet[2905]: W0114 01:41:33.397038 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.397090 kubelet[2905]: E0114 01:41:33.397046 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.397180 kubelet[2905]: E0114 01:41:33.397168 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.397180 kubelet[2905]: W0114 01:41:33.397177 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.397247 kubelet[2905]: E0114 01:41:33.397184 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.397334 kubelet[2905]: E0114 01:41:33.397322 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.397334 kubelet[2905]: W0114 01:41:33.397332 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.397410 kubelet[2905]: E0114 01:41:33.397341 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.397469 kubelet[2905]: E0114 01:41:33.397458 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.397469 kubelet[2905]: W0114 01:41:33.397467 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.397517 kubelet[2905]: E0114 01:41:33.397474 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.397619 kubelet[2905]: E0114 01:41:33.397609 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.397649 kubelet[2905]: W0114 01:41:33.397619 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.397649 kubelet[2905]: E0114 01:41:33.397626 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.397878 kubelet[2905]: E0114 01:41:33.397862 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.397878 kubelet[2905]: W0114 01:41:33.397877 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.397940 kubelet[2905]: E0114 01:41:33.397887 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.398047 kubelet[2905]: E0114 01:41:33.398035 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.398047 kubelet[2905]: W0114 01:41:33.398045 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.398096 kubelet[2905]: E0114 01:41:33.398052 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.398222 kubelet[2905]: E0114 01:41:33.398211 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.398243 kubelet[2905]: W0114 01:41:33.398221 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.398243 kubelet[2905]: E0114 01:41:33.398230 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.398474 kubelet[2905]: E0114 01:41:33.398457 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.398474 kubelet[2905]: W0114 01:41:33.398473 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.398532 kubelet[2905]: E0114 01:41:33.398484 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 01:41:33.398656 kubelet[2905]: E0114 01:41:33.398644 2905 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 01:41:33.398656 kubelet[2905]: W0114 01:41:33.398654 2905 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 01:41:33.398707 kubelet[2905]: E0114 01:41:33.398662 2905 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 01:41:33.415000 audit: BPF prog-id=171 op=LOAD Jan 14 01:41:33.415000 audit[3645]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3491 pid=3645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:33.415000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663313665616630356532663332663063363737346639373065366633 Jan 14 01:41:33.415000 audit: BPF prog-id=172 op=LOAD Jan 14 01:41:33.415000 audit[3645]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3491 pid=3645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:33.415000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663313665616630356532663332663063363737346639373065366633 Jan 14 01:41:33.415000 audit: BPF prog-id=172 op=UNLOAD Jan 14 01:41:33.415000 audit[3645]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:33.415000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663313665616630356532663332663063363737346639373065366633 Jan 14 01:41:33.415000 audit: BPF prog-id=171 op=UNLOAD Jan 14 01:41:33.415000 audit[3645]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:33.415000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663313665616630356532663332663063363737346639373065366633 Jan 14 01:41:33.415000 audit: BPF prog-id=173 op=LOAD Jan 14 01:41:33.415000 audit[3645]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3491 pid=3645 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:33.415000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3663313665616630356532663332663063363737346639373065366633 Jan 14 01:41:33.437651 containerd[1676]: time="2026-01-14T01:41:33.436946033Z" level=info msg="StartContainer for \"6c16eaf05e2f32f0c6774f970e6f3cddfbd39f4292406bf4ceb13e7d358d7d5a\" returns successfully" Jan 14 01:41:33.446540 systemd[1]: cri-containerd-6c16eaf05e2f32f0c6774f970e6f3cddfbd39f4292406bf4ceb13e7d358d7d5a.scope: Deactivated successfully. 
Jan 14 01:41:33.450028 containerd[1676]: time="2026-01-14T01:41:33.449984266Z" level=info msg="received container exit event container_id:\"6c16eaf05e2f32f0c6774f970e6f3cddfbd39f4292406bf4ceb13e7d358d7d5a\" id:\"6c16eaf05e2f32f0c6774f970e6f3cddfbd39f4292406bf4ceb13e7d358d7d5a\" pid:3658 exited_at:{seconds:1768354893 nanos:449615665}" Jan 14 01:41:33.453000 audit: BPF prog-id=173 op=UNLOAD Jan 14 01:41:33.470082 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6c16eaf05e2f32f0c6774f970e6f3cddfbd39f4292406bf4ceb13e7d358d7d5a-rootfs.mount: Deactivated successfully. Jan 14 01:41:34.310755 containerd[1676]: time="2026-01-14T01:41:34.310685830Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 14 01:41:34.326641 kubelet[2905]: I0114 01:41:34.326265 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7d48b46ff6-zr4fg" podStartSLOduration=2.974280329 podStartE2EDuration="5.326250149s" podCreationTimestamp="2026-01-14 01:41:29 +0000 UTC" firstStartedPulling="2026-01-14 01:41:29.448902029 +0000 UTC m=+24.322388104" lastFinishedPulling="2026-01-14 01:41:31.800871889 +0000 UTC m=+26.674357924" observedRunningTime="2026-01-14 01:41:32.318858669 +0000 UTC m=+27.192344744" watchObservedRunningTime="2026-01-14 01:41:34.326250149 +0000 UTC m=+29.199736184" Jan 14 01:41:35.231033 kubelet[2905]: E0114 01:41:35.230931 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:41:37.229758 kubelet[2905]: E0114 01:41:37.229408 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
cni plugin not initialized" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:41:37.679889 containerd[1676]: time="2026-01-14T01:41:37.679842402Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:37.681077 containerd[1676]: time="2026-01-14T01:41:37.681036805Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Jan 14 01:41:37.688162 containerd[1676]: time="2026-01-14T01:41:37.687984822Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:37.691738 containerd[1676]: time="2026-01-14T01:41:37.691378791Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:37.692958 containerd[1676]: time="2026-01-14T01:41:37.692925155Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 3.382047845s" Jan 14 01:41:37.693065 containerd[1676]: time="2026-01-14T01:41:37.693049155Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Jan 14 01:41:37.699878 containerd[1676]: time="2026-01-14T01:41:37.699843892Z" level=info msg="CreateContainer within sandbox \"994c2c014c1c08403335b6ee806ffeba314de50a59edb63eb1475fdbe621bfd1\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" 
Jan 14 01:41:37.708264 containerd[1676]: time="2026-01-14T01:41:37.707575911Z" level=info msg="Container a5f6737436106981608045fc02c642108bc2d6a5fe719359b4c43a8b7c80fe0d: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:41:37.719900 containerd[1676]: time="2026-01-14T01:41:37.719860582Z" level=info msg="CreateContainer within sandbox \"994c2c014c1c08403335b6ee806ffeba314de50a59edb63eb1475fdbe621bfd1\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a5f6737436106981608045fc02c642108bc2d6a5fe719359b4c43a8b7c80fe0d\"" Jan 14 01:41:37.720787 containerd[1676]: time="2026-01-14T01:41:37.720758465Z" level=info msg="StartContainer for \"a5f6737436106981608045fc02c642108bc2d6a5fe719359b4c43a8b7c80fe0d\"" Jan 14 01:41:37.722304 containerd[1676]: time="2026-01-14T01:41:37.722262108Z" level=info msg="connecting to shim a5f6737436106981608045fc02c642108bc2d6a5fe719359b4c43a8b7c80fe0d" address="unix:///run/containerd/s/a9194509a4f24f78b4c5c9cb0646a82440dc2a2cea687db7dffd5b7d013ea7d5" protocol=ttrpc version=3 Jan 14 01:41:37.742936 systemd[1]: Started cri-containerd-a5f6737436106981608045fc02c642108bc2d6a5fe719359b4c43a8b7c80fe0d.scope - libcontainer container a5f6737436106981608045fc02c642108bc2d6a5fe719359b4c43a8b7c80fe0d. 
Jan 14 01:41:37.799000 audit: BPF prog-id=174 op=LOAD Jan 14 01:41:37.802001 kernel: kauditd_printk_skb: 28 callbacks suppressed Jan 14 01:41:37.802059 kernel: audit: type=1334 audit(1768354897.799:576): prog-id=174 op=LOAD Jan 14 01:41:37.799000 audit[3738]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3491 pid=3738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:37.806231 kernel: audit: type=1300 audit(1768354897.799:576): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3491 pid=3738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:37.806341 kernel: audit: type=1327 audit(1768354897.799:576): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135663637333734333631303639383136303830343566633032633634 Jan 14 01:41:37.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135663637333734333631303639383136303830343566633032633634 Jan 14 01:41:37.799000 audit: BPF prog-id=175 op=LOAD Jan 14 01:41:37.810581 kernel: audit: type=1334 audit(1768354897.799:577): prog-id=175 op=LOAD Jan 14 01:41:37.799000 audit[3738]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3491 pid=3738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:37.814244 kernel: audit: type=1300 audit(1768354897.799:577): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3491 pid=3738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:37.814410 kernel: audit: type=1327 audit(1768354897.799:577): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135663637333734333631303639383136303830343566633032633634 Jan 14 01:41:37.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135663637333734333631303639383136303830343566633032633634 Jan 14 01:41:37.800000 audit: BPF prog-id=175 op=UNLOAD Jan 14 01:41:37.818796 kernel: audit: type=1334 audit(1768354897.800:578): prog-id=175 op=UNLOAD Jan 14 01:41:37.800000 audit[3738]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:37.822994 kernel: audit: type=1300 audit(1768354897.800:578): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:37.800000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135663637333734333631303639383136303830343566633032633634 Jan 14 01:41:37.826475 kernel: audit: type=1327 audit(1768354897.800:578): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135663637333734333631303639383136303830343566633032633634 Jan 14 01:41:37.826530 kernel: audit: type=1334 audit(1768354897.800:579): prog-id=174 op=UNLOAD Jan 14 01:41:37.800000 audit: BPF prog-id=174 op=UNLOAD Jan 14 01:41:37.800000 audit[3738]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:37.800000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135663637333734333631303639383136303830343566633032633634 Jan 14 01:41:37.800000 audit: BPF prog-id=176 op=LOAD Jan 14 01:41:37.800000 audit[3738]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3491 pid=3738 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:37.800000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135663637333734333631303639383136303830343566633032633634 Jan 14 01:41:37.839371 containerd[1676]: time="2026-01-14T01:41:37.839256722Z" level=info msg="StartContainer for \"a5f6737436106981608045fc02c642108bc2d6a5fe719359b4c43a8b7c80fe0d\" returns successfully" Jan 14 01:41:39.093010 containerd[1676]: time="2026-01-14T01:41:39.092945026Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 14 01:41:39.095542 systemd[1]: cri-containerd-a5f6737436106981608045fc02c642108bc2d6a5fe719359b4c43a8b7c80fe0d.scope: Deactivated successfully. Jan 14 01:41:39.096442 systemd[1]: cri-containerd-a5f6737436106981608045fc02c642108bc2d6a5fe719359b4c43a8b7c80fe0d.scope: Consumed 484ms CPU time, 188.8M memory peak, 165.9M written to disk. Jan 14 01:41:39.096804 containerd[1676]: time="2026-01-14T01:41:39.096763916Z" level=info msg="received container exit event container_id:\"a5f6737436106981608045fc02c642108bc2d6a5fe719359b4c43a8b7c80fe0d\" id:\"a5f6737436106981608045fc02c642108bc2d6a5fe719359b4c43a8b7c80fe0d\" pid:3751 exited_at:{seconds:1768354899 nanos:96461315}" Jan 14 01:41:39.101000 audit: BPF prog-id=176 op=UNLOAD Jan 14 01:41:39.119736 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a5f6737436106981608045fc02c642108bc2d6a5fe719359b4c43a8b7c80fe0d-rootfs.mount: Deactivated successfully. 
Jan 14 01:41:39.191641 kubelet[2905]: I0114 01:41:39.191571 2905 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 14 01:41:40.530890 systemd[1]: Created slice kubepods-besteffort-pod1b8220b4_811d_4471_95d7_cea88df93438.slice - libcontainer container kubepods-besteffort-pod1b8220b4_811d_4471_95d7_cea88df93438.slice. Jan 14 01:41:40.537407 containerd[1676]: time="2026-01-14T01:41:40.537367849Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-twlzn,Uid:1b8220b4-811d-4471-95d7-cea88df93438,Namespace:calico-system,Attempt:0,}" Jan 14 01:41:40.540491 systemd[1]: Created slice kubepods-besteffort-pod8f23c871_1821_4e53_80e3_947513960a4b.slice - libcontainer container kubepods-besteffort-pod8f23c871_1821_4e53_80e3_947513960a4b.slice. Jan 14 01:41:40.547949 kubelet[2905]: I0114 01:41:40.541163 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7650f1f8-3708-460a-ab38-5c398404750b-config-volume\") pod \"coredns-674b8bbfcf-5gt6s\" (UID: \"7650f1f8-3708-460a-ab38-5c398404750b\") " pod="kube-system/coredns-674b8bbfcf-5gt6s" Jan 14 01:41:40.547949 kubelet[2905]: I0114 01:41:40.541192 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dzw9\" (UniqueName: \"kubernetes.io/projected/7650f1f8-3708-460a-ab38-5c398404750b-kube-api-access-6dzw9\") pod \"coredns-674b8bbfcf-5gt6s\" (UID: \"7650f1f8-3708-460a-ab38-5c398404750b\") " pod="kube-system/coredns-674b8bbfcf-5gt6s" Jan 14 01:41:40.547949 kubelet[2905]: I0114 01:41:40.541236 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f23c871-1821-4e53-80e3-947513960a4b-tigera-ca-bundle\") pod \"calico-kube-controllers-b67dd7dbc-k5lcl\" (UID: \"8f23c871-1821-4e53-80e3-947513960a4b\") " 
pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" Jan 14 01:41:40.547949 kubelet[2905]: I0114 01:41:40.541253 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf62l\" (UniqueName: \"kubernetes.io/projected/8f23c871-1821-4e53-80e3-947513960a4b-kube-api-access-pf62l\") pod \"calico-kube-controllers-b67dd7dbc-k5lcl\" (UID: \"8f23c871-1821-4e53-80e3-947513960a4b\") " pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" Jan 14 01:41:40.546304 systemd[1]: Created slice kubepods-burstable-pod7650f1f8_3708_460a_ab38_5c398404750b.slice - libcontainer container kubepods-burstable-pod7650f1f8_3708_460a_ab38_5c398404750b.slice. Jan 14 01:41:40.568860 systemd[1]: Created slice kubepods-burstable-pod4d0d03be_d2cc_4c75_ae9e_07c3631ee9eb.slice - libcontainer container kubepods-burstable-pod4d0d03be_d2cc_4c75_ae9e_07c3631ee9eb.slice. Jan 14 01:41:40.579466 systemd[1]: Created slice kubepods-besteffort-podadf9db04_ef07_4e4b_ac7b_0a044973dca8.slice - libcontainer container kubepods-besteffort-podadf9db04_ef07_4e4b_ac7b_0a044973dca8.slice. Jan 14 01:41:40.585430 systemd[1]: Created slice kubepods-besteffort-podac329ad1_edb3_4891_9a01_4d5e568d082e.slice - libcontainer container kubepods-besteffort-podac329ad1_edb3_4891_9a01_4d5e568d082e.slice. Jan 14 01:41:40.596110 systemd[1]: Created slice kubepods-besteffort-pode6b0fe41_664b_4db1_a6b0_8f9f42046133.slice - libcontainer container kubepods-besteffort-pode6b0fe41_664b_4db1_a6b0_8f9f42046133.slice. Jan 14 01:41:40.601627 systemd[1]: Created slice kubepods-besteffort-pode6d84f7a_c9f6_41d8_94ff_304c6e803e1e.slice - libcontainer container kubepods-besteffort-pode6d84f7a_c9f6_41d8_94ff_304c6e803e1e.slice. 
Jan 14 01:41:40.636241 containerd[1676]: time="2026-01-14T01:41:40.636188297Z" level=error msg="Failed to destroy network for sandbox \"cdb9fb5297f047bee21f21f5c88ce2c7b5a3b0e5eefb393e8e00dbd9f6063f23\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.637824 systemd[1]: run-netns-cni\x2d88bc838f\x2d8438\x2d28b7\x2d9b4e\x2dfab149ab4952.mount: Deactivated successfully. Jan 14 01:41:40.639917 containerd[1676]: time="2026-01-14T01:41:40.639865427Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-twlzn,Uid:1b8220b4-811d-4471-95d7-cea88df93438,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdb9fb5297f047bee21f21f5c88ce2c7b5a3b0e5eefb393e8e00dbd9f6063f23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.640341 kubelet[2905]: E0114 01:41:40.640295 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdb9fb5297f047bee21f21f5c88ce2c7b5a3b0e5eefb393e8e00dbd9f6063f23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.640535 kubelet[2905]: E0114 01:41:40.640476 2905 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdb9fb5297f047bee21f21f5c88ce2c7b5a3b0e5eefb393e8e00dbd9f6063f23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/csi-node-driver-twlzn" Jan 14 01:41:40.640661 kubelet[2905]: E0114 01:41:40.640584 2905 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdb9fb5297f047bee21f21f5c88ce2c7b5a3b0e5eefb393e8e00dbd9f6063f23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-twlzn" Jan 14 01:41:40.640789 kubelet[2905]: E0114 01:41:40.640646 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-twlzn_calico-system(1b8220b4-811d-4471-95d7-cea88df93438)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-twlzn_calico-system(1b8220b4-811d-4471-95d7-cea88df93438)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cdb9fb5297f047bee21f21f5c88ce2c7b5a3b0e5eefb393e8e00dbd9f6063f23\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:41:40.641542 kubelet[2905]: I0114 01:41:40.641492 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6d84f7a-c9f6-41d8-94ff-304c6e803e1e-goldmane-ca-bundle\") pod \"goldmane-666569f655-jg7bs\" (UID: \"e6d84f7a-c9f6-41d8-94ff-304c6e803e1e\") " pod="calico-system/goldmane-666569f655-jg7bs" Jan 14 01:41:40.641542 kubelet[2905]: I0114 01:41:40.641533 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9blv8\" (UniqueName: 
\"kubernetes.io/projected/ac329ad1-edb3-4891-9a01-4d5e568d082e-kube-api-access-9blv8\") pod \"calico-apiserver-675898b8d4-fz6xg\" (UID: \"ac329ad1-edb3-4891-9a01-4d5e568d082e\") " pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" Jan 14 01:41:40.641542 kubelet[2905]: I0114 01:41:40.641550 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/adf9db04-ef07-4e4b-ac7b-0a044973dca8-calico-apiserver-certs\") pod \"calico-apiserver-675898b8d4-dphrr\" (UID: \"adf9db04-ef07-4e4b-ac7b-0a044973dca8\") " pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" Jan 14 01:41:40.641659 kubelet[2905]: I0114 01:41:40.641567 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e6b0fe41-664b-4db1-a6b0-8f9f42046133-whisker-backend-key-pair\") pod \"whisker-7f846496cf-nkwmn\" (UID: \"e6b0fe41-664b-4db1-a6b0-8f9f42046133\") " pod="calico-system/whisker-7f846496cf-nkwmn" Jan 14 01:41:40.641659 kubelet[2905]: I0114 01:41:40.641582 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6d84f7a-c9f6-41d8-94ff-304c6e803e1e-config\") pod \"goldmane-666569f655-jg7bs\" (UID: \"e6d84f7a-c9f6-41d8-94ff-304c6e803e1e\") " pod="calico-system/goldmane-666569f655-jg7bs" Jan 14 01:41:40.641659 kubelet[2905]: I0114 01:41:40.641598 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgnjt\" (UniqueName: \"kubernetes.io/projected/4d0d03be-d2cc-4c75-ae9e-07c3631ee9eb-kube-api-access-rgnjt\") pod \"coredns-674b8bbfcf-fk5c6\" (UID: \"4d0d03be-d2cc-4c75-ae9e-07c3631ee9eb\") " pod="kube-system/coredns-674b8bbfcf-fk5c6" Jan 14 01:41:40.641659 kubelet[2905]: I0114 01:41:40.641624 2905 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6b0fe41-664b-4db1-a6b0-8f9f42046133-whisker-ca-bundle\") pod \"whisker-7f846496cf-nkwmn\" (UID: \"e6b0fe41-664b-4db1-a6b0-8f9f42046133\") " pod="calico-system/whisker-7f846496cf-nkwmn" Jan 14 01:41:40.641659 kubelet[2905]: I0114 01:41:40.641638 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5vfh\" (UniqueName: \"kubernetes.io/projected/e6b0fe41-664b-4db1-a6b0-8f9f42046133-kube-api-access-n5vfh\") pod \"whisker-7f846496cf-nkwmn\" (UID: \"e6b0fe41-664b-4db1-a6b0-8f9f42046133\") " pod="calico-system/whisker-7f846496cf-nkwmn" Jan 14 01:41:40.643942 kubelet[2905]: I0114 01:41:40.641653 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/e6d84f7a-c9f6-41d8-94ff-304c6e803e1e-goldmane-key-pair\") pod \"goldmane-666569f655-jg7bs\" (UID: \"e6d84f7a-c9f6-41d8-94ff-304c6e803e1e\") " pod="calico-system/goldmane-666569f655-jg7bs" Jan 14 01:41:40.643942 kubelet[2905]: I0114 01:41:40.641670 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjtvd\" (UniqueName: \"kubernetes.io/projected/adf9db04-ef07-4e4b-ac7b-0a044973dca8-kube-api-access-fjtvd\") pod \"calico-apiserver-675898b8d4-dphrr\" (UID: \"adf9db04-ef07-4e4b-ac7b-0a044973dca8\") " pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" Jan 14 01:41:40.643942 kubelet[2905]: I0114 01:41:40.641703 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d0d03be-d2cc-4c75-ae9e-07c3631ee9eb-config-volume\") pod \"coredns-674b8bbfcf-fk5c6\" (UID: \"4d0d03be-d2cc-4c75-ae9e-07c3631ee9eb\") " pod="kube-system/coredns-674b8bbfcf-fk5c6" Jan 14 01:41:40.643942 
kubelet[2905]: I0114 01:41:40.641889 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk8sx\" (UniqueName: \"kubernetes.io/projected/e6d84f7a-c9f6-41d8-94ff-304c6e803e1e-kube-api-access-tk8sx\") pod \"goldmane-666569f655-jg7bs\" (UID: \"e6d84f7a-c9f6-41d8-94ff-304c6e803e1e\") " pod="calico-system/goldmane-666569f655-jg7bs" Jan 14 01:41:40.643942 kubelet[2905]: I0114 01:41:40.641915 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ac329ad1-edb3-4891-9a01-4d5e568d082e-calico-apiserver-certs\") pod \"calico-apiserver-675898b8d4-fz6xg\" (UID: \"ac329ad1-edb3-4891-9a01-4d5e568d082e\") " pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" Jan 14 01:41:40.849951 containerd[1676]: time="2026-01-14T01:41:40.849836753Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b67dd7dbc-k5lcl,Uid:8f23c871-1821-4e53-80e3-947513960a4b,Namespace:calico-system,Attempt:0,}" Jan 14 01:41:40.860766 containerd[1676]: time="2026-01-14T01:41:40.860699060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5gt6s,Uid:7650f1f8-3708-460a-ab38-5c398404750b,Namespace:kube-system,Attempt:0,}" Jan 14 01:41:40.875660 containerd[1676]: time="2026-01-14T01:41:40.875615338Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fk5c6,Uid:4d0d03be-d2cc-4c75-ae9e-07c3631ee9eb,Namespace:kube-system,Attempt:0,}" Jan 14 01:41:40.885009 containerd[1676]: time="2026-01-14T01:41:40.884965201Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-675898b8d4-dphrr,Uid:adf9db04-ef07-4e4b-ac7b-0a044973dca8,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:41:40.890971 containerd[1676]: time="2026-01-14T01:41:40.890926816Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-675898b8d4-fz6xg,Uid:ac329ad1-edb3-4891-9a01-4d5e568d082e,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:41:40.900263 containerd[1676]: time="2026-01-14T01:41:40.900172759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f846496cf-nkwmn,Uid:e6b0fe41-664b-4db1-a6b0-8f9f42046133,Namespace:calico-system,Attempt:0,}" Jan 14 01:41:40.903700 containerd[1676]: time="2026-01-14T01:41:40.903624448Z" level=error msg="Failed to destroy network for sandbox \"b2d69c2b513a601478ea5839e241953629a861e4b74ffddb13f9e616d773452e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.907035 containerd[1676]: time="2026-01-14T01:41:40.906971137Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b67dd7dbc-k5lcl,Uid:8f23c871-1821-4e53-80e3-947513960a4b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2d69c2b513a601478ea5839e241953629a861e4b74ffddb13f9e616d773452e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.907604 kubelet[2905]: E0114 01:41:40.907486 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2d69c2b513a601478ea5839e241953629a861e4b74ffddb13f9e616d773452e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.908279 kubelet[2905]: E0114 01:41:40.908253 2905 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"b2d69c2b513a601478ea5839e241953629a861e4b74ffddb13f9e616d773452e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" Jan 14 01:41:40.908704 kubelet[2905]: E0114 01:41:40.908388 2905 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2d69c2b513a601478ea5839e241953629a861e4b74ffddb13f9e616d773452e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" Jan 14 01:41:40.908704 kubelet[2905]: E0114 01:41:40.908439 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-b67dd7dbc-k5lcl_calico-system(8f23c871-1821-4e53-80e3-947513960a4b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-b67dd7dbc-k5lcl_calico-system(8f23c871-1821-4e53-80e3-947513960a4b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b2d69c2b513a601478ea5839e241953629a861e4b74ffddb13f9e616d773452e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:41:40.909063 containerd[1676]: time="2026-01-14T01:41:40.908615421Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-jg7bs,Uid:e6d84f7a-c9f6-41d8-94ff-304c6e803e1e,Namespace:calico-system,Attempt:0,}" Jan 14 01:41:40.948879 containerd[1676]: time="2026-01-14T01:41:40.948830882Z" 
level=error msg="Failed to destroy network for sandbox \"937e5ebae7190a5553a96606fc1d91461ea6d3eda1bc8a0ec3ab1e44898d9bcd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.952624 containerd[1676]: time="2026-01-14T01:41:40.952009130Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5gt6s,Uid:7650f1f8-3708-460a-ab38-5c398404750b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"937e5ebae7190a5553a96606fc1d91461ea6d3eda1bc8a0ec3ab1e44898d9bcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.953580 kubelet[2905]: E0114 01:41:40.952237 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"937e5ebae7190a5553a96606fc1d91461ea6d3eda1bc8a0ec3ab1e44898d9bcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.953580 kubelet[2905]: E0114 01:41:40.952290 2905 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"937e5ebae7190a5553a96606fc1d91461ea6d3eda1bc8a0ec3ab1e44898d9bcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-5gt6s" Jan 14 01:41:40.953580 kubelet[2905]: E0114 01:41:40.952314 2905 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"937e5ebae7190a5553a96606fc1d91461ea6d3eda1bc8a0ec3ab1e44898d9bcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-5gt6s" Jan 14 01:41:40.953687 kubelet[2905]: E0114 01:41:40.952355 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-5gt6s_kube-system(7650f1f8-3708-460a-ab38-5c398404750b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-5gt6s_kube-system(7650f1f8-3708-460a-ab38-5c398404750b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"937e5ebae7190a5553a96606fc1d91461ea6d3eda1bc8a0ec3ab1e44898d9bcd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-5gt6s" podUID="7650f1f8-3708-460a-ab38-5c398404750b" Jan 14 01:41:40.962530 containerd[1676]: time="2026-01-14T01:41:40.962481996Z" level=error msg="Failed to destroy network for sandbox \"1b79066f00a60e8604bc18e7573237c9f256d7f2e79c8b0f02396ff455b69b10\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.964700 containerd[1676]: time="2026-01-14T01:41:40.964643521Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fk5c6,Uid:4d0d03be-d2cc-4c75-ae9e-07c3631ee9eb,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b79066f00a60e8604bc18e7573237c9f256d7f2e79c8b0f02396ff455b69b10\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.965487 kubelet[2905]: E0114 01:41:40.964976 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b79066f00a60e8604bc18e7573237c9f256d7f2e79c8b0f02396ff455b69b10\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.965487 kubelet[2905]: E0114 01:41:40.965030 2905 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b79066f00a60e8604bc18e7573237c9f256d7f2e79c8b0f02396ff455b69b10\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-fk5c6" Jan 14 01:41:40.965487 kubelet[2905]: E0114 01:41:40.965053 2905 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b79066f00a60e8604bc18e7573237c9f256d7f2e79c8b0f02396ff455b69b10\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-fk5c6" Jan 14 01:41:40.965630 kubelet[2905]: E0114 01:41:40.965095 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-fk5c6_kube-system(4d0d03be-d2cc-4c75-ae9e-07c3631ee9eb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-fk5c6_kube-system(4d0d03be-d2cc-4c75-ae9e-07c3631ee9eb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"1b79066f00a60e8604bc18e7573237c9f256d7f2e79c8b0f02396ff455b69b10\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-fk5c6" podUID="4d0d03be-d2cc-4c75-ae9e-07c3631ee9eb" Jan 14 01:41:40.980823 containerd[1676]: time="2026-01-14T01:41:40.980760442Z" level=error msg="Failed to destroy network for sandbox \"1f6d00b4ec444fbee2dbb9915abb2208c52493f8643eba29fc0150059c7ff6c2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.982877 containerd[1676]: time="2026-01-14T01:41:40.982838327Z" level=error msg="Failed to destroy network for sandbox \"f5d41ec03072bdc4bdf494ebc8d5f7c14e77dcea1b86afa8ba48696f78a445e9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.983629 containerd[1676]: time="2026-01-14T01:41:40.983578689Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-675898b8d4-dphrr,Uid:adf9db04-ef07-4e4b-ac7b-0a044973dca8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1f6d00b4ec444fbee2dbb9915abb2208c52493f8643eba29fc0150059c7ff6c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.984112 kubelet[2905]: E0114 01:41:40.984069 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1f6d00b4ec444fbee2dbb9915abb2208c52493f8643eba29fc0150059c7ff6c2\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.984293 kubelet[2905]: E0114 01:41:40.984233 2905 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1f6d00b4ec444fbee2dbb9915abb2208c52493f8643eba29fc0150059c7ff6c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" Jan 14 01:41:40.984430 kubelet[2905]: E0114 01:41:40.984404 2905 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1f6d00b4ec444fbee2dbb9915abb2208c52493f8643eba29fc0150059c7ff6c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" Jan 14 01:41:40.984604 kubelet[2905]: E0114 01:41:40.984577 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-675898b8d4-dphrr_calico-apiserver(adf9db04-ef07-4e4b-ac7b-0a044973dca8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-675898b8d4-dphrr_calico-apiserver(adf9db04-ef07-4e4b-ac7b-0a044973dca8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1f6d00b4ec444fbee2dbb9915abb2208c52493f8643eba29fc0150059c7ff6c2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" 
podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:41:40.985766 containerd[1676]: time="2026-01-14T01:41:40.985632534Z" level=error msg="Failed to destroy network for sandbox \"7b508dc1ab347315a09b93e7b0221d512dfc581ada9192e1402233b3f9019db0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.987237 containerd[1676]: time="2026-01-14T01:41:40.987052657Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f846496cf-nkwmn,Uid:e6b0fe41-664b-4db1-a6b0-8f9f42046133,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5d41ec03072bdc4bdf494ebc8d5f7c14e77dcea1b86afa8ba48696f78a445e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.987391 kubelet[2905]: E0114 01:41:40.987351 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5d41ec03072bdc4bdf494ebc8d5f7c14e77dcea1b86afa8ba48696f78a445e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.987435 kubelet[2905]: E0114 01:41:40.987407 2905 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5d41ec03072bdc4bdf494ebc8d5f7c14e77dcea1b86afa8ba48696f78a445e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7f846496cf-nkwmn" Jan 14 01:41:40.987461 kubelet[2905]: E0114 
01:41:40.987426 2905 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5d41ec03072bdc4bdf494ebc8d5f7c14e77dcea1b86afa8ba48696f78a445e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7f846496cf-nkwmn" Jan 14 01:41:40.987531 kubelet[2905]: E0114 01:41:40.987502 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7f846496cf-nkwmn_calico-system(e6b0fe41-664b-4db1-a6b0-8f9f42046133)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7f846496cf-nkwmn_calico-system(e6b0fe41-664b-4db1-a6b0-8f9f42046133)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f5d41ec03072bdc4bdf494ebc8d5f7c14e77dcea1b86afa8ba48696f78a445e9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7f846496cf-nkwmn" podUID="e6b0fe41-664b-4db1-a6b0-8f9f42046133" Jan 14 01:41:40.989751 containerd[1676]: time="2026-01-14T01:41:40.989533584Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-675898b8d4-fz6xg,Uid:ac329ad1-edb3-4891-9a01-4d5e568d082e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b508dc1ab347315a09b93e7b0221d512dfc581ada9192e1402233b3f9019db0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.989846 kubelet[2905]: E0114 01:41:40.989741 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc 
= failed to setup network for sandbox \"7b508dc1ab347315a09b93e7b0221d512dfc581ada9192e1402233b3f9019db0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.989846 kubelet[2905]: E0114 01:41:40.989782 2905 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b508dc1ab347315a09b93e7b0221d512dfc581ada9192e1402233b3f9019db0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" Jan 14 01:41:40.989846 kubelet[2905]: E0114 01:41:40.989801 2905 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b508dc1ab347315a09b93e7b0221d512dfc581ada9192e1402233b3f9019db0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" Jan 14 01:41:40.989923 kubelet[2905]: E0114 01:41:40.989842 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-675898b8d4-fz6xg_calico-apiserver(ac329ad1-edb3-4891-9a01-4d5e568d082e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-675898b8d4-fz6xg_calico-apiserver(ac329ad1-edb3-4891-9a01-4d5e568d082e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7b508dc1ab347315a09b93e7b0221d512dfc581ada9192e1402233b3f9019db0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:41:40.991570 containerd[1676]: time="2026-01-14T01:41:40.991535109Z" level=error msg="Failed to destroy network for sandbox \"ef4012f54bf170574c6f163d28e47541434ce8bb06d50cf1bc478a4c1cbbd11d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.994210 containerd[1676]: time="2026-01-14T01:41:40.993705154Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-jg7bs,Uid:e6d84f7a-c9f6-41d8-94ff-304c6e803e1e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef4012f54bf170574c6f163d28e47541434ce8bb06d50cf1bc478a4c1cbbd11d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.994356 kubelet[2905]: E0114 01:41:40.993966 2905 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef4012f54bf170574c6f163d28e47541434ce8bb06d50cf1bc478a4c1cbbd11d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 01:41:40.994356 kubelet[2905]: E0114 01:41:40.994000 2905 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef4012f54bf170574c6f163d28e47541434ce8bb06d50cf1bc478a4c1cbbd11d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-666569f655-jg7bs" Jan 14 01:41:40.994356 kubelet[2905]: E0114 01:41:40.994035 2905 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef4012f54bf170574c6f163d28e47541434ce8bb06d50cf1bc478a4c1cbbd11d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-jg7bs" Jan 14 01:41:40.994460 kubelet[2905]: E0114 01:41:40.994078 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-jg7bs_calico-system(e6d84f7a-c9f6-41d8-94ff-304c6e803e1e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-jg7bs_calico-system(e6d84f7a-c9f6-41d8-94ff-304c6e803e1e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ef4012f54bf170574c6f163d28e47541434ce8bb06d50cf1bc478a4c1cbbd11d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:41:41.329360 containerd[1676]: time="2026-01-14T01:41:41.329318636Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 14 01:41:47.561701 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3251830329.mount: Deactivated successfully. 
Jan 14 01:41:47.582623 containerd[1676]: time="2026-01-14T01:41:47.582568241Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:47.584096 containerd[1676]: time="2026-01-14T01:41:47.584044445Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Jan 14 01:41:47.584937 containerd[1676]: time="2026-01-14T01:41:47.584896527Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:47.587414 containerd[1676]: time="2026-01-14T01:41:47.587378493Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 01:41:47.588489 containerd[1676]: time="2026-01-14T01:41:47.588453456Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 6.25907406s" Jan 14 01:41:47.588538 containerd[1676]: time="2026-01-14T01:41:47.588487936Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Jan 14 01:41:47.602820 containerd[1676]: time="2026-01-14T01:41:47.602775012Z" level=info msg="CreateContainer within sandbox \"994c2c014c1c08403335b6ee806ffeba314de50a59edb63eb1475fdbe621bfd1\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 14 01:41:47.611861 containerd[1676]: time="2026-01-14T01:41:47.610809232Z" level=info msg="Container 
5ed974016f2e3b3315d5061620fca6e0df4aadf04b851d5e449a6f828d401095: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:41:47.622708 containerd[1676]: time="2026-01-14T01:41:47.622653982Z" level=info msg="CreateContainer within sandbox \"994c2c014c1c08403335b6ee806ffeba314de50a59edb63eb1475fdbe621bfd1\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"5ed974016f2e3b3315d5061620fca6e0df4aadf04b851d5e449a6f828d401095\"" Jan 14 01:41:47.623558 containerd[1676]: time="2026-01-14T01:41:47.623516784Z" level=info msg="StartContainer for \"5ed974016f2e3b3315d5061620fca6e0df4aadf04b851d5e449a6f828d401095\"" Jan 14 01:41:47.625634 containerd[1676]: time="2026-01-14T01:41:47.625596309Z" level=info msg="connecting to shim 5ed974016f2e3b3315d5061620fca6e0df4aadf04b851d5e449a6f828d401095" address="unix:///run/containerd/s/a9194509a4f24f78b4c5c9cb0646a82440dc2a2cea687db7dffd5b7d013ea7d5" protocol=ttrpc version=3 Jan 14 01:41:47.647019 systemd[1]: Started cri-containerd-5ed974016f2e3b3315d5061620fca6e0df4aadf04b851d5e449a6f828d401095.scope - libcontainer container 5ed974016f2e3b3315d5061620fca6e0df4aadf04b851d5e449a6f828d401095. 
Jan 14 01:41:47.705000 audit: BPF prog-id=177 op=LOAD Jan 14 01:41:47.708344 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 14 01:41:47.708408 kernel: audit: type=1334 audit(1768354907.705:582): prog-id=177 op=LOAD Jan 14 01:41:47.708438 kernel: audit: type=1300 audit(1768354907.705:582): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3491 pid=4059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:47.705000 audit[4059]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3491 pid=4059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:47.705000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565643937343031366632653362333331356435303631363230666361 Jan 14 01:41:47.715082 kernel: audit: type=1327 audit(1768354907.705:582): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565643937343031366632653362333331356435303631363230666361 Jan 14 01:41:47.715186 kernel: audit: type=1334 audit(1768354907.705:583): prog-id=178 op=LOAD Jan 14 01:41:47.705000 audit: BPF prog-id=178 op=LOAD Jan 14 01:41:47.705000 audit[4059]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3491 pid=4059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:47.719659 kernel: audit: type=1300 audit(1768354907.705:583): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3491 pid=4059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:47.705000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565643937343031366632653362333331356435303631363230666361 Jan 14 01:41:47.723123 kernel: audit: type=1327 audit(1768354907.705:583): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565643937343031366632653362333331356435303631363230666361 Jan 14 01:41:47.723174 kernel: audit: type=1334 audit(1768354907.706:584): prog-id=178 op=UNLOAD Jan 14 01:41:47.706000 audit: BPF prog-id=178 op=UNLOAD Jan 14 01:41:47.706000 audit[4059]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=4059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:47.727261 kernel: audit: type=1300 audit(1768354907.706:584): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=4059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:47.706000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565643937343031366632653362333331356435303631363230666361 Jan 14 01:41:47.730710 kernel: audit: type=1327 audit(1768354907.706:584): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565643937343031366632653362333331356435303631363230666361 Jan 14 01:41:47.706000 audit: BPF prog-id=177 op=UNLOAD Jan 14 01:41:47.731834 kernel: audit: type=1334 audit(1768354907.706:585): prog-id=177 op=UNLOAD Jan 14 01:41:47.706000 audit[4059]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=4059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:47.706000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565643937343031366632653362333331356435303631363230666361 Jan 14 01:41:47.706000 audit: BPF prog-id=179 op=LOAD Jan 14 01:41:47.706000 audit[4059]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3491 pid=4059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:47.706000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565643937343031366632653362333331356435303631363230666361 Jan 14 01:41:47.749784 containerd[1676]: time="2026-01-14T01:41:47.749645820Z" level=info msg="StartContainer for \"5ed974016f2e3b3315d5061620fca6e0df4aadf04b851d5e449a6f828d401095\" returns successfully" Jan 14 01:41:47.884245 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 14 01:41:47.884433 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 14 01:41:48.096166 kubelet[2905]: I0114 01:41:48.096131 2905 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6b0fe41-664b-4db1-a6b0-8f9f42046133-whisker-ca-bundle\") pod \"e6b0fe41-664b-4db1-a6b0-8f9f42046133\" (UID: \"e6b0fe41-664b-4db1-a6b0-8f9f42046133\") " Jan 14 01:41:48.096538 kubelet[2905]: I0114 01:41:48.096503 2905 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5vfh\" (UniqueName: \"kubernetes.io/projected/e6b0fe41-664b-4db1-a6b0-8f9f42046133-kube-api-access-n5vfh\") pod \"e6b0fe41-664b-4db1-a6b0-8f9f42046133\" (UID: \"e6b0fe41-664b-4db1-a6b0-8f9f42046133\") " Jan 14 01:41:48.096570 kubelet[2905]: I0114 01:41:48.096547 2905 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e6b0fe41-664b-4db1-a6b0-8f9f42046133-whisker-backend-key-pair\") pod \"e6b0fe41-664b-4db1-a6b0-8f9f42046133\" (UID: \"e6b0fe41-664b-4db1-a6b0-8f9f42046133\") " Jan 14 01:41:48.097169 kubelet[2905]: I0114 01:41:48.097065 2905 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b0fe41-664b-4db1-a6b0-8f9f42046133-whisker-ca-bundle" 
(OuterVolumeSpecName: "whisker-ca-bundle") pod "e6b0fe41-664b-4db1-a6b0-8f9f42046133" (UID: "e6b0fe41-664b-4db1-a6b0-8f9f42046133"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 14 01:41:48.099919 kubelet[2905]: I0114 01:41:48.099863 2905 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6b0fe41-664b-4db1-a6b0-8f9f42046133-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "e6b0fe41-664b-4db1-a6b0-8f9f42046133" (UID: "e6b0fe41-664b-4db1-a6b0-8f9f42046133"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 14 01:41:48.100997 kubelet[2905]: I0114 01:41:48.100969 2905 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b0fe41-664b-4db1-a6b0-8f9f42046133-kube-api-access-n5vfh" (OuterVolumeSpecName: "kube-api-access-n5vfh") pod "e6b0fe41-664b-4db1-a6b0-8f9f42046133" (UID: "e6b0fe41-664b-4db1-a6b0-8f9f42046133"). InnerVolumeSpecName "kube-api-access-n5vfh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 14 01:41:48.198032 kubelet[2905]: I0114 01:41:48.197960 2905 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e6b0fe41-664b-4db1-a6b0-8f9f42046133-whisker-backend-key-pair\") on node \"ci-4578-0-0-p-96753e66ce\" DevicePath \"\"" Jan 14 01:41:48.198032 kubelet[2905]: I0114 01:41:48.197997 2905 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6b0fe41-664b-4db1-a6b0-8f9f42046133-whisker-ca-bundle\") on node \"ci-4578-0-0-p-96753e66ce\" DevicePath \"\"" Jan 14 01:41:48.198032 kubelet[2905]: I0114 01:41:48.198007 2905 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n5vfh\" (UniqueName: \"kubernetes.io/projected/e6b0fe41-664b-4db1-a6b0-8f9f42046133-kube-api-access-n5vfh\") on node \"ci-4578-0-0-p-96753e66ce\" DevicePath \"\"" Jan 14 01:41:48.351657 systemd[1]: Removed slice kubepods-besteffort-pode6b0fe41_664b_4db1_a6b0_8f9f42046133.slice - libcontainer container kubepods-besteffort-pode6b0fe41_664b_4db1_a6b0_8f9f42046133.slice. Jan 14 01:41:48.365128 kubelet[2905]: I0114 01:41:48.364993 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-r5qs2" podStartSLOduration=1.3738849100000001 podStartE2EDuration="19.364976564s" podCreationTimestamp="2026-01-14 01:41:29 +0000 UTC" firstStartedPulling="2026-01-14 01:41:29.598195564 +0000 UTC m=+24.471681639" lastFinishedPulling="2026-01-14 01:41:47.589287218 +0000 UTC m=+42.462773293" observedRunningTime="2026-01-14 01:41:48.364345322 +0000 UTC m=+43.237831437" watchObservedRunningTime="2026-01-14 01:41:48.364976564 +0000 UTC m=+43.238462679" Jan 14 01:41:48.420851 systemd[1]: Created slice kubepods-besteffort-podafb8f8b9_3d6e_46a2_b7ae_ef74d2aa4a11.slice - libcontainer container kubepods-besteffort-podafb8f8b9_3d6e_46a2_b7ae_ef74d2aa4a11.slice. 
Jan 14 01:41:48.500113 kubelet[2905]: I0114 01:41:48.500026 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11-whisker-ca-bundle\") pod \"whisker-77b7df9c9-vcm8x\" (UID: \"afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11\") " pod="calico-system/whisker-77b7df9c9-vcm8x" Jan 14 01:41:48.500113 kubelet[2905]: I0114 01:41:48.500089 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtqjc\" (UniqueName: \"kubernetes.io/projected/afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11-kube-api-access-gtqjc\") pod \"whisker-77b7df9c9-vcm8x\" (UID: \"afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11\") " pod="calico-system/whisker-77b7df9c9-vcm8x" Jan 14 01:41:48.500446 kubelet[2905]: I0114 01:41:48.500179 2905 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11-whisker-backend-key-pair\") pod \"whisker-77b7df9c9-vcm8x\" (UID: \"afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11\") " pod="calico-system/whisker-77b7df9c9-vcm8x" Jan 14 01:41:48.562478 systemd[1]: var-lib-kubelet-pods-e6b0fe41\x2d664b\x2d4db1\x2da6b0\x2d8f9f42046133-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dn5vfh.mount: Deactivated successfully. Jan 14 01:41:48.562570 systemd[1]: var-lib-kubelet-pods-e6b0fe41\x2d664b\x2d4db1\x2da6b0\x2d8f9f42046133-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Jan 14 01:41:48.725790 containerd[1676]: time="2026-01-14T01:41:48.725658228Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-77b7df9c9-vcm8x,Uid:afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11,Namespace:calico-system,Attempt:0,}" Jan 14 01:41:48.858816 systemd-networkd[1586]: calicd51f06384d: Link UP Jan 14 01:41:48.859012 systemd-networkd[1586]: calicd51f06384d: Gained carrier Jan 14 01:41:48.874545 containerd[1676]: 2026-01-14 01:41:48.748 [INFO][4123] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 01:41:48.874545 containerd[1676]: 2026-01-14 01:41:48.767 [INFO][4123] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--96753e66ce-k8s-whisker--77b7df9c9--vcm8x-eth0 whisker-77b7df9c9- calico-system afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11 894 0 2026-01-14 01:41:48 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:77b7df9c9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4578-0-0-p-96753e66ce whisker-77b7df9c9-vcm8x eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calicd51f06384d [] [] }} ContainerID="4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" Namespace="calico-system" Pod="whisker-77b7df9c9-vcm8x" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-whisker--77b7df9c9--vcm8x-" Jan 14 01:41:48.874545 containerd[1676]: 2026-01-14 01:41:48.767 [INFO][4123] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" Namespace="calico-system" Pod="whisker-77b7df9c9-vcm8x" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-whisker--77b7df9c9--vcm8x-eth0" Jan 14 01:41:48.874545 containerd[1676]: 2026-01-14 01:41:48.809 [INFO][4138] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" HandleID="k8s-pod-network.4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" Workload="ci--4578--0--0--p--96753e66ce-k8s-whisker--77b7df9c9--vcm8x-eth0" Jan 14 01:41:48.874801 containerd[1676]: 2026-01-14 01:41:48.809 [INFO][4138] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" HandleID="k8s-pod-network.4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" Workload="ci--4578--0--0--p--96753e66ce-k8s-whisker--77b7df9c9--vcm8x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c2fd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4578-0-0-p-96753e66ce", "pod":"whisker-77b7df9c9-vcm8x", "timestamp":"2026-01-14 01:41:48.809418518 +0000 UTC"}, Hostname:"ci-4578-0-0-p-96753e66ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:41:48.874801 containerd[1676]: 2026-01-14 01:41:48.809 [INFO][4138] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:41:48.874801 containerd[1676]: 2026-01-14 01:41:48.809 [INFO][4138] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:41:48.874801 containerd[1676]: 2026-01-14 01:41:48.809 [INFO][4138] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-96753e66ce' Jan 14 01:41:48.874801 containerd[1676]: 2026-01-14 01:41:48.819 [INFO][4138] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:48.874801 containerd[1676]: 2026-01-14 01:41:48.826 [INFO][4138] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:48.874801 containerd[1676]: 2026-01-14 01:41:48.830 [INFO][4138] ipam/ipam.go 511: Trying affinity for 192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:48.874801 containerd[1676]: 2026-01-14 01:41:48.832 [INFO][4138] ipam/ipam.go 158: Attempting to load block cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:48.874801 containerd[1676]: 2026-01-14 01:41:48.834 [INFO][4138] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:48.874976 containerd[1676]: 2026-01-14 01:41:48.834 [INFO][4138] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.20.0/26 handle="k8s-pod-network.4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:48.874976 containerd[1676]: 2026-01-14 01:41:48.836 [INFO][4138] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9 Jan 14 01:41:48.874976 containerd[1676]: 2026-01-14 01:41:48.840 [INFO][4138] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.20.0/26 handle="k8s-pod-network.4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:48.874976 containerd[1676]: 2026-01-14 01:41:48.847 [INFO][4138] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.20.1/26] block=192.168.20.0/26 handle="k8s-pod-network.4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:48.874976 containerd[1676]: 2026-01-14 01:41:48.848 [INFO][4138] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.20.1/26] handle="k8s-pod-network.4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:48.874976 containerd[1676]: 2026-01-14 01:41:48.848 [INFO][4138] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:41:48.874976 containerd[1676]: 2026-01-14 01:41:48.848 [INFO][4138] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.20.1/26] IPv6=[] ContainerID="4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" HandleID="k8s-pod-network.4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" Workload="ci--4578--0--0--p--96753e66ce-k8s-whisker--77b7df9c9--vcm8x-eth0" Jan 14 01:41:48.875107 containerd[1676]: 2026-01-14 01:41:48.850 [INFO][4123] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" Namespace="calico-system" Pod="whisker-77b7df9c9-vcm8x" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-whisker--77b7df9c9--vcm8x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-whisker--77b7df9c9--vcm8x-eth0", GenerateName:"whisker-77b7df9c9-", Namespace:"calico-system", SelfLink:"", UID:"afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11", ResourceVersion:"894", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"77b7df9c9", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"", Pod:"whisker-77b7df9c9-vcm8x", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.20.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calicd51f06384d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:48.875107 containerd[1676]: 2026-01-14 01:41:48.850 [INFO][4123] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.20.1/32] ContainerID="4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" Namespace="calico-system" Pod="whisker-77b7df9c9-vcm8x" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-whisker--77b7df9c9--vcm8x-eth0" Jan 14 01:41:48.875177 containerd[1676]: 2026-01-14 01:41:48.850 [INFO][4123] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicd51f06384d ContainerID="4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" Namespace="calico-system" Pod="whisker-77b7df9c9-vcm8x" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-whisker--77b7df9c9--vcm8x-eth0" Jan 14 01:41:48.875177 containerd[1676]: 2026-01-14 01:41:48.859 [INFO][4123] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" Namespace="calico-system" Pod="whisker-77b7df9c9-vcm8x" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-whisker--77b7df9c9--vcm8x-eth0" Jan 14 01:41:48.875215 containerd[1676]: 2026-01-14 01:41:48.860 [INFO][4123] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID 
to endpoint ContainerID="4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" Namespace="calico-system" Pod="whisker-77b7df9c9-vcm8x" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-whisker--77b7df9c9--vcm8x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-whisker--77b7df9c9--vcm8x-eth0", GenerateName:"whisker-77b7df9c9-", Namespace:"calico-system", SelfLink:"", UID:"afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11", ResourceVersion:"894", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"77b7df9c9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9", Pod:"whisker-77b7df9c9-vcm8x", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.20.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calicd51f06384d", MAC:"46:10:90:fc:dd:75", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:48.875262 containerd[1676]: 2026-01-14 01:41:48.870 [INFO][4123] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" Namespace="calico-system" 
Pod="whisker-77b7df9c9-vcm8x" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-whisker--77b7df9c9--vcm8x-eth0" Jan 14 01:41:48.893349 containerd[1676]: time="2026-01-14T01:41:48.893278769Z" level=info msg="connecting to shim 4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9" address="unix:///run/containerd/s/ac0d3d9a42646e45b418f2b60800ec09928873dcdf22b079f300d3ebca5ae191" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:48.922029 systemd[1]: Started cri-containerd-4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9.scope - libcontainer container 4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9. Jan 14 01:41:48.930000 audit: BPF prog-id=180 op=LOAD Jan 14 01:41:48.931000 audit: BPF prog-id=181 op=LOAD Jan 14 01:41:48.931000 audit[4176]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4164 pid=4176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:48.931000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463666263643562396239333134326237346362353565356532386134 Jan 14 01:41:48.931000 audit: BPF prog-id=181 op=UNLOAD Jan 14 01:41:48.931000 audit[4176]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4164 pid=4176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:48.931000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463666263643562396239333134326237346362353565356532386134 Jan 14 01:41:48.931000 audit: BPF prog-id=182 op=LOAD Jan 14 01:41:48.931000 audit[4176]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4164 pid=4176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:48.931000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463666263643562396239333134326237346362353565356532386134 Jan 14 01:41:48.931000 audit: BPF prog-id=183 op=LOAD Jan 14 01:41:48.931000 audit[4176]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4164 pid=4176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:48.931000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463666263643562396239333134326237346362353565356532386134 Jan 14 01:41:48.931000 audit: BPF prog-id=183 op=UNLOAD Jan 14 01:41:48.931000 audit[4176]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4164 pid=4176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:41:48.931000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463666263643562396239333134326237346362353565356532386134 Jan 14 01:41:48.931000 audit: BPF prog-id=182 op=UNLOAD Jan 14 01:41:48.931000 audit[4176]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4164 pid=4176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:48.931000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463666263643562396239333134326237346362353565356532386134 Jan 14 01:41:48.931000 audit: BPF prog-id=184 op=LOAD Jan 14 01:41:48.931000 audit[4176]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4164 pid=4176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:48.931000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463666263643562396239333134326237346362353565356532386134 Jan 14 01:41:48.954257 containerd[1676]: time="2026-01-14T01:41:48.954218042Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-77b7df9c9-vcm8x,Uid:afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"4cfbcd5b9b93142b74cb55e5e28a43b9e35f99c33240ee1582558e070edebed9\"" Jan 14 01:41:48.955747 containerd[1676]: time="2026-01-14T01:41:48.955698325Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:41:49.232227 kubelet[2905]: I0114 01:41:49.232046 2905 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b0fe41-664b-4db1-a6b0-8f9f42046133" path="/var/lib/kubelet/pods/e6b0fe41-664b-4db1-a6b0-8f9f42046133/volumes" Jan 14 01:41:49.296211 containerd[1676]: time="2026-01-14T01:41:49.296157219Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:41:49.297265 containerd[1676]: time="2026-01-14T01:41:49.297224102Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:41:49.297328 containerd[1676]: time="2026-01-14T01:41:49.297281782Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:41:49.297516 kubelet[2905]: E0114 01:41:49.297469 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:41:49.297564 kubelet[2905]: E0114 01:41:49.297540 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:41:49.297712 kubelet[2905]: E0114 01:41:49.297677 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1bdf15913aa7461abab7765f5b689915,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gtqjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-77b7df9c9-vcm8x_calico-system(afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:41:49.301110 containerd[1676]: time="2026-01-14T01:41:49.301077432Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:41:49.631395 containerd[1676]: 
time="2026-01-14T01:41:49.631281860Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:41:49.633120 containerd[1676]: time="2026-01-14T01:41:49.633017664Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:41:49.633120 containerd[1676]: time="2026-01-14T01:41:49.633065864Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:41:49.633275 kubelet[2905]: E0114 01:41:49.633221 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:41:49.633275 kubelet[2905]: E0114 01:41:49.633267 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:41:49.633439 kubelet[2905]: E0114 01:41:49.633384 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gtqjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-77b7df9c9-vcm8x_calico-system(afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:41:49.634604 kubelet[2905]: E0114 01:41:49.634546 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11" Jan 14 01:41:50.351381 kubelet[2905]: E0114 01:41:50.351278 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11" Jan 14 01:41:50.368000 audit[4332]: NETFILTER_CFG table=filter:119 family=2 entries=22 op=nft_register_rule pid=4332 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:50.368000 audit[4332]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffe5cf25d0 a2=0 a3=1 items=0 ppid=3072 pid=4332 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:50.368000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:50.375000 audit[4332]: NETFILTER_CFG table=nat:120 family=2 entries=12 op=nft_register_rule pid=4332 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:50.375000 audit[4332]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe5cf25d0 a2=0 a3=1 items=0 ppid=3072 pid=4332 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:50.375000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:50.723212 systemd-networkd[1586]: calicd51f06384d: Gained IPv6LL Jan 14 01:41:51.110954 kubelet[2905]: I0114 01:41:51.110792 2905 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 14 01:41:51.182000 audit[4334]: NETFILTER_CFG table=filter:121 family=2 entries=21 op=nft_register_rule pid=4334 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:51.182000 audit[4334]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffdbadfed0 a2=0 a3=1 items=0 ppid=3072 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
01:41:51.182000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:51.190000 audit[4334]: NETFILTER_CFG table=nat:122 family=2 entries=19 op=nft_register_chain pid=4334 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:51.190000 audit[4334]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffdbadfed0 a2=0 a3=1 items=0 ppid=3072 pid=4334 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.190000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:51.498000 audit: BPF prog-id=185 op=LOAD Jan 14 01:41:51.498000 audit[4391]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe64985a8 a2=98 a3=ffffe6498598 items=0 ppid=4344 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.498000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:41:51.498000 audit: BPF prog-id=185 op=UNLOAD Jan 14 01:41:51.498000 audit[4391]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffe6498578 a3=0 items=0 ppid=4344 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.498000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:41:51.499000 audit: BPF prog-id=186 op=LOAD Jan 14 01:41:51.499000 audit[4391]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe6498458 a2=74 a3=95 items=0 ppid=4344 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.499000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:41:51.499000 audit: BPF prog-id=186 op=UNLOAD Jan 14 01:41:51.499000 audit[4391]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4344 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.499000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:41:51.499000 audit: BPF prog-id=187 op=LOAD Jan 14 01:41:51.499000 audit[4391]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe6498488 a2=40 a3=ffffe64984b8 items=0 ppid=4344 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.499000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:41:51.499000 audit: BPF prog-id=187 op=UNLOAD Jan 14 01:41:51.499000 audit[4391]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffe64984b8 items=0 ppid=4344 pid=4391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.499000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 01:41:51.501000 audit: BPF prog-id=188 op=LOAD Jan 14 01:41:51.501000 audit[4394]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd617e868 a2=98 a3=ffffd617e858 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.501000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.502000 audit: BPF prog-id=188 op=UNLOAD Jan 14 01:41:51.502000 audit[4394]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd617e838 a3=0 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.502000 audit: PROCTITLE 
proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.502000 audit: BPF prog-id=189 op=LOAD Jan 14 01:41:51.502000 audit[4394]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd617e4f8 a2=74 a3=95 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.502000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.503000 audit: BPF prog-id=189 op=UNLOAD Jan 14 01:41:51.503000 audit[4394]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.503000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.503000 audit: BPF prog-id=190 op=LOAD Jan 14 01:41:51.503000 audit[4394]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd617e558 a2=94 a3=2 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.503000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.503000 audit: BPF prog-id=190 op=UNLOAD Jan 14 01:41:51.503000 audit[4394]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.503000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.603000 
audit: BPF prog-id=191 op=LOAD Jan 14 01:41:51.603000 audit[4394]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd617e518 a2=40 a3=ffffd617e548 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.603000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.603000 audit: BPF prog-id=191 op=UNLOAD Jan 14 01:41:51.603000 audit[4394]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffd617e548 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.603000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.612000 audit: BPF prog-id=192 op=LOAD Jan 14 01:41:51.612000 audit[4394]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffd617e528 a2=94 a3=4 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.612000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.612000 audit: BPF prog-id=192 op=UNLOAD Jan 14 01:41:51.612000 audit[4394]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.612000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.613000 audit: BPF prog-id=193 op=LOAD Jan 14 01:41:51.613000 
audit[4394]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd617e368 a2=94 a3=5 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.613000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.613000 audit: BPF prog-id=193 op=UNLOAD Jan 14 01:41:51.613000 audit[4394]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.613000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.613000 audit: BPF prog-id=194 op=LOAD Jan 14 01:41:51.613000 audit[4394]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffd617e598 a2=94 a3=6 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.613000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.613000 audit: BPF prog-id=194 op=UNLOAD Jan 14 01:41:51.613000 audit[4394]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.613000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.613000 audit: BPF prog-id=195 op=LOAD Jan 14 01:41:51.613000 audit[4394]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 
a1=ffffd617dd68 a2=94 a3=83 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.613000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.613000 audit: BPF prog-id=196 op=LOAD Jan 14 01:41:51.613000 audit[4394]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffd617db28 a2=94 a3=2 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.613000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.613000 audit: BPF prog-id=196 op=UNLOAD Jan 14 01:41:51.613000 audit[4394]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.613000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.614000 audit: BPF prog-id=195 op=UNLOAD Jan 14 01:41:51.614000 audit[4394]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=1ce91620 a3=1ce84b00 items=0 ppid=4344 pid=4394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.614000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 01:41:51.623000 audit: BPF prog-id=197 op=LOAD Jan 14 01:41:51.623000 audit[4398]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe31ada08 a2=98 a3=ffffe31ad9f8 items=0 ppid=4344 pid=4398 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.623000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:41:51.623000 audit: BPF prog-id=197 op=UNLOAD Jan 14 01:41:51.623000 audit[4398]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffe31ad9d8 a3=0 items=0 ppid=4344 pid=4398 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.623000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:41:51.623000 audit: BPF prog-id=198 op=LOAD Jan 14 01:41:51.623000 audit[4398]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe31ad8b8 a2=74 a3=95 items=0 ppid=4344 pid=4398 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.623000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:41:51.623000 audit: BPF prog-id=198 op=UNLOAD Jan 14 01:41:51.623000 audit[4398]: SYSCALL arch=c00000b7 
syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4344 pid=4398 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.623000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:41:51.623000 audit: BPF prog-id=199 op=LOAD Jan 14 01:41:51.623000 audit[4398]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe31ad8e8 a2=40 a3=ffffe31ad918 items=0 ppid=4344 pid=4398 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.623000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:41:51.623000 audit: BPF prog-id=199 op=UNLOAD Jan 14 01:41:51.623000 audit[4398]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffe31ad918 items=0 ppid=4344 pid=4398 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.623000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 01:41:51.690246 
systemd-networkd[1586]: vxlan.calico: Link UP Jan 14 01:41:51.690252 systemd-networkd[1586]: vxlan.calico: Gained carrier Jan 14 01:41:51.710000 audit: BPF prog-id=200 op=LOAD Jan 14 01:41:51.710000 audit[4421]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcf7655c8 a2=98 a3=ffffcf7655b8 items=0 ppid=4344 pid=4421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.710000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:41:51.710000 audit: BPF prog-id=200 op=UNLOAD Jan 14 01:41:51.710000 audit[4421]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffcf765598 a3=0 items=0 ppid=4344 pid=4421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.710000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:41:51.710000 audit: BPF prog-id=201 op=LOAD Jan 14 01:41:51.710000 audit[4421]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcf7652a8 a2=74 a3=95 items=0 ppid=4344 pid=4421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.710000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:41:51.710000 audit: BPF prog-id=201 op=UNLOAD Jan 14 01:41:51.710000 audit[4421]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4344 pid=4421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.710000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:41:51.710000 audit: BPF prog-id=202 op=LOAD Jan 14 01:41:51.710000 audit[4421]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcf765308 a2=94 a3=2 items=0 ppid=4344 pid=4421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.710000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:41:51.710000 audit: BPF prog-id=202 op=UNLOAD Jan 14 01:41:51.710000 audit[4421]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4344 pid=4421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.710000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:41:51.710000 audit: BPF prog-id=203 op=LOAD Jan 14 01:41:51.710000 audit[4421]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffcf765188 a2=40 a3=ffffcf7651b8 items=0 ppid=4344 pid=4421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.710000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:41:51.711000 audit: BPF prog-id=203 op=UNLOAD Jan 14 01:41:51.711000 audit[4421]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=ffffcf7651b8 items=0 ppid=4344 pid=4421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.711000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:41:51.711000 audit: BPF prog-id=204 op=LOAD Jan 14 01:41:51.711000 audit[4421]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffcf7652d8 a2=94 a3=b7 items=0 ppid=4344 pid=4421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.711000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:41:51.711000 audit: BPF prog-id=204 op=UNLOAD Jan 14 01:41:51.711000 audit[4421]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4344 pid=4421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.711000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:41:51.711000 audit: BPF prog-id=205 op=LOAD Jan 14 01:41:51.711000 audit[4421]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffcf764988 a2=94 a3=2 items=0 ppid=4344 pid=4421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.711000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:41:51.711000 audit: BPF prog-id=205 op=UNLOAD Jan 14 01:41:51.711000 audit[4421]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4344 pid=4421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.711000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:41:51.711000 audit: BPF prog-id=206 op=LOAD Jan 14 01:41:51.711000 audit[4421]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffcf764b18 a2=94 a3=30 items=0 ppid=4344 pid=4421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.711000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 01:41:51.714000 audit: BPF prog-id=207 op=LOAD Jan 14 01:41:51.714000 audit[4425]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdede6df8 a2=98 a3=ffffdede6de8 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.714000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.714000 audit: BPF prog-id=207 op=UNLOAD Jan 14 01:41:51.714000 audit[4425]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffdede6dc8 a3=0 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.714000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.714000 audit: BPF prog-id=208 op=LOAD Jan 14 01:41:51.714000 audit[4425]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffdede6a88 a2=74 a3=95 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.714000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.714000 audit: BPF prog-id=208 op=UNLOAD Jan 14 01:41:51.714000 audit[4425]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.714000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.714000 audit: BPF prog-id=209 op=LOAD Jan 14 01:41:51.714000 audit[4425]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffdede6ae8 a2=94 a3=2 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.714000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.714000 audit: BPF prog-id=209 op=UNLOAD Jan 14 01:41:51.714000 audit[4425]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.714000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.818000 audit: BPF prog-id=210 op=LOAD Jan 14 01:41:51.818000 audit[4425]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffdede6aa8 a2=40 a3=ffffdede6ad8 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.818000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.818000 audit: BPF prog-id=210 op=UNLOAD Jan 14 01:41:51.818000 audit[4425]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffdede6ad8 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.818000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.827000 audit: BPF prog-id=211 op=LOAD Jan 14 01:41:51.827000 audit[4425]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffdede6ab8 a2=94 a3=4 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.827000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.828000 audit: BPF prog-id=211 op=UNLOAD Jan 14 01:41:51.828000 audit[4425]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.828000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.828000 audit: BPF prog-id=212 op=LOAD Jan 14 01:41:51.828000 audit[4425]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffdede68f8 a2=94 a3=5 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.828000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.828000 audit: BPF prog-id=212 op=UNLOAD Jan 14 01:41:51.828000 audit[4425]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.828000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.828000 audit: BPF prog-id=213 op=LOAD Jan 14 01:41:51.828000 audit[4425]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffdede6b28 a2=94 a3=6 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.828000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.828000 audit: BPF prog-id=213 op=UNLOAD Jan 14 01:41:51.828000 audit[4425]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.828000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.828000 audit: BPF prog-id=214 op=LOAD Jan 14 01:41:51.828000 audit[4425]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffdede62f8 a2=94 a3=83 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.828000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.829000 audit: BPF prog-id=215 op=LOAD Jan 14 01:41:51.829000 audit[4425]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffdede60b8 a2=94 a3=2 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.829000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.829000 audit: BPF prog-id=215 op=UNLOAD Jan 14 01:41:51.829000 audit[4425]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.829000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.829000 audit: BPF prog-id=214 op=UNLOAD Jan 14 01:41:51.829000 audit[4425]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=2a4c8620 a3=2a4bbb00 items=0 ppid=4344 pid=4425 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.829000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 01:41:51.844000 audit: BPF prog-id=206 op=UNLOAD Jan 14 01:41:51.844000 audit[4344]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=4000c0e4c0 a2=0 a3=0 items=0 ppid=4203 pid=4344 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.844000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 14 01:41:51.888000 audit[4452]: NETFILTER_CFG table=mangle:123 family=2 entries=16 op=nft_register_chain pid=4452 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:41:51.888000 audit[4452]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=fffff72b1150 a2=0 a3=ffff9110dfa8 items=0 ppid=4344 pid=4452 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.888000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:41:51.891000 audit[4454]: NETFILTER_CFG table=nat:124 family=2 entries=15 op=nft_register_chain pid=4454 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:41:51.891000 audit[4454]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=ffffed3bacc0 a2=0 a3=ffff963c3fa8 items=0 ppid=4344 pid=4454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.891000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:41:51.898000 audit[4455]: NETFILTER_CFG table=raw:125 family=2 entries=21 op=nft_register_chain pid=4455 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:41:51.898000 audit[4455]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=ffffedea4160 a2=0 a3=ffff92ef0fa8 items=0 ppid=4344 pid=4455 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.898000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:41:51.900000 audit[4453]: NETFILTER_CFG table=filter:126 family=2 entries=94 op=nft_register_chain pid=4453 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:41:51.900000 audit[4453]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=53116 a0=3 a1=ffffde26f4f0 a2=0 a3=ffffac82efa8 items=0 ppid=4344 pid=4453 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:51.900000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:41:52.229858 containerd[1676]: time="2026-01-14T01:41:52.229745538Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-675898b8d4-fz6xg,Uid:ac329ad1-edb3-4891-9a01-4d5e568d082e,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:41:52.328934 systemd-networkd[1586]: calid631348a93b: Link UP Jan 14 01:41:52.329477 systemd-networkd[1586]: calid631348a93b: Gained carrier Jan 14 01:41:52.343929 containerd[1676]: 2026-01-14 01:41:52.267 [INFO][4468] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--fz6xg-eth0 calico-apiserver-675898b8d4- calico-apiserver ac329ad1-edb3-4891-9a01-4d5e568d082e 827 0 2026-01-14 01:41:22 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:675898b8d4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4578-0-0-p-96753e66ce calico-apiserver-675898b8d4-fz6xg eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid631348a93b [] [] }} ContainerID="b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-fz6xg" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--fz6xg-" Jan 14 01:41:52.343929 containerd[1676]: 2026-01-14 01:41:52.267 [INFO][4468] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-fz6xg" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--fz6xg-eth0" Jan 14 01:41:52.343929 containerd[1676]: 2026-01-14 01:41:52.289 [INFO][4482] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" HandleID="k8s-pod-network.b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" Workload="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--fz6xg-eth0" Jan 14 01:41:52.344344 containerd[1676]: 2026-01-14 01:41:52.289 [INFO][4482] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" HandleID="k8s-pod-network.b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" Workload="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--fz6xg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dd860), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4578-0-0-p-96753e66ce", "pod":"calico-apiserver-675898b8d4-fz6xg", "timestamp":"2026-01-14 01:41:52.289424727 +0000 UTC"}, Hostname:"ci-4578-0-0-p-96753e66ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:41:52.344344 containerd[1676]: 2026-01-14 01:41:52.289 [INFO][4482] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:41:52.344344 containerd[1676]: 2026-01-14 01:41:52.290 [INFO][4482] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:41:52.344344 containerd[1676]: 2026-01-14 01:41:52.290 [INFO][4482] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-96753e66ce' Jan 14 01:41:52.344344 containerd[1676]: 2026-01-14 01:41:52.300 [INFO][4482] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:52.344344 containerd[1676]: 2026-01-14 01:41:52.304 [INFO][4482] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:52.344344 containerd[1676]: 2026-01-14 01:41:52.309 [INFO][4482] ipam/ipam.go 511: Trying affinity for 192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:52.344344 containerd[1676]: 2026-01-14 01:41:52.311 [INFO][4482] ipam/ipam.go 158: Attempting to load block cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:52.344344 containerd[1676]: 2026-01-14 01:41:52.313 [INFO][4482] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:52.344546 containerd[1676]: 2026-01-14 01:41:52.313 [INFO][4482] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.20.0/26 handle="k8s-pod-network.b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:52.344546 containerd[1676]: 2026-01-14 01:41:52.315 [INFO][4482] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451 Jan 14 01:41:52.344546 containerd[1676]: 2026-01-14 01:41:52.319 [INFO][4482] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.20.0/26 handle="k8s-pod-network.b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:52.344546 containerd[1676]: 2026-01-14 01:41:52.324 [INFO][4482] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.20.2/26] block=192.168.20.0/26 handle="k8s-pod-network.b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:52.344546 containerd[1676]: 2026-01-14 01:41:52.324 [INFO][4482] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.20.2/26] handle="k8s-pod-network.b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:52.344546 containerd[1676]: 2026-01-14 01:41:52.324 [INFO][4482] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:41:52.344546 containerd[1676]: 2026-01-14 01:41:52.324 [INFO][4482] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.20.2/26] IPv6=[] ContainerID="b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" HandleID="k8s-pod-network.b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" Workload="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--fz6xg-eth0" Jan 14 01:41:52.344672 containerd[1676]: 2026-01-14 01:41:52.326 [INFO][4468] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-fz6xg" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--fz6xg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--fz6xg-eth0", GenerateName:"calico-apiserver-675898b8d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"ac329ad1-edb3-4891-9a01-4d5e568d082e", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"675898b8d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"", Pod:"calico-apiserver-675898b8d4-fz6xg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.20.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid631348a93b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:52.344750 containerd[1676]: 2026-01-14 01:41:52.326 [INFO][4468] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.20.2/32] ContainerID="b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-fz6xg" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--fz6xg-eth0" Jan 14 01:41:52.344750 containerd[1676]: 2026-01-14 01:41:52.326 [INFO][4468] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid631348a93b ContainerID="b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-fz6xg" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--fz6xg-eth0" Jan 14 01:41:52.344750 containerd[1676]: 2026-01-14 01:41:52.330 [INFO][4468] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-fz6xg" 
WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--fz6xg-eth0" Jan 14 01:41:52.344818 containerd[1676]: 2026-01-14 01:41:52.330 [INFO][4468] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-fz6xg" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--fz6xg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--fz6xg-eth0", GenerateName:"calico-apiserver-675898b8d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"ac329ad1-edb3-4891-9a01-4d5e568d082e", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"675898b8d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451", Pod:"calico-apiserver-675898b8d4-fz6xg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.20.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid631348a93b", MAC:"8a:c2:8e:6d:0f:f1", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:52.344865 containerd[1676]: 2026-01-14 01:41:52.340 [INFO][4468] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-fz6xg" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--fz6xg-eth0" Jan 14 01:41:52.355000 audit[4498]: NETFILTER_CFG table=filter:127 family=2 entries=50 op=nft_register_chain pid=4498 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:41:52.355000 audit[4498]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=28208 a0=3 a1=ffffcdb3dcf0 a2=0 a3=ffff87c14fa8 items=0 ppid=4344 pid=4498 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:52.355000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:41:52.367996 containerd[1676]: time="2026-01-14T01:41:52.367946164Z" level=info msg="connecting to shim b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451" address="unix:///run/containerd/s/c5fc796147d110f1d631d60b94608881610d8e399120e5dc1b91a4ca5fcd0c16" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:52.395934 systemd[1]: Started cri-containerd-b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451.scope - libcontainer container b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451. 
Jan 14 01:41:52.404000 audit: BPF prog-id=216 op=LOAD Jan 14 01:41:52.404000 audit: BPF prog-id=217 op=LOAD Jan 14 01:41:52.404000 audit[4519]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4507 pid=4519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:52.404000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235363433306534656236663565643835346164646565656536343334 Jan 14 01:41:52.404000 audit: BPF prog-id=217 op=UNLOAD Jan 14 01:41:52.404000 audit[4519]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4507 pid=4519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:52.404000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235363433306534656236663565643835346164646565656536343334 Jan 14 01:41:52.404000 audit: BPF prog-id=218 op=LOAD Jan 14 01:41:52.404000 audit[4519]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4507 pid=4519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:52.404000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235363433306534656236663565643835346164646565656536343334 Jan 14 01:41:52.405000 audit: BPF prog-id=219 op=LOAD Jan 14 01:41:52.405000 audit[4519]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4507 pid=4519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:52.405000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235363433306534656236663565643835346164646565656536343334 Jan 14 01:41:52.405000 audit: BPF prog-id=219 op=UNLOAD Jan 14 01:41:52.405000 audit[4519]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4507 pid=4519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:52.405000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235363433306534656236663565643835346164646565656536343334 Jan 14 01:41:52.405000 audit: BPF prog-id=218 op=UNLOAD Jan 14 01:41:52.405000 audit[4519]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4507 pid=4519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
14 01:41:52.405000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235363433306534656236663565643835346164646565656536343334 Jan 14 01:41:52.405000 audit: BPF prog-id=220 op=LOAD Jan 14 01:41:52.405000 audit[4519]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4507 pid=4519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:52.405000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235363433306534656236663565643835346164646565656536343334 Jan 14 01:41:52.432330 containerd[1676]: time="2026-01-14T01:41:52.432284046Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-675898b8d4-fz6xg,Uid:ac329ad1-edb3-4891-9a01-4d5e568d082e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b56430e4eb6f5ed854addeeee6434a0e61ab6bd782ce23271a875767c5f7e451\"" Jan 14 01:41:52.438112 containerd[1676]: time="2026-01-14T01:41:52.438057300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:41:52.746182 containerd[1676]: time="2026-01-14T01:41:52.745883112Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:41:52.749504 containerd[1676]: time="2026-01-14T01:41:52.749457001Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 
01:41:52.749853 containerd[1676]: time="2026-01-14T01:41:52.749588562Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:41:52.750151 kubelet[2905]: E0114 01:41:52.750075 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:41:52.750151 kubelet[2905]: E0114 01:41:52.750148 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:41:52.750445 kubelet[2905]: E0114 01:41:52.750275 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9blv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-675898b8d4-fz6xg_calico-apiserver(ac329ad1-edb3-4891-9a01-4d5e568d082e): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:41:52.751583 kubelet[2905]: E0114 01:41:52.751548 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:41:53.230054 containerd[1676]: time="2026-01-14T01:41:53.229990007Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5gt6s,Uid:7650f1f8-3708-460a-ab38-5c398404750b,Namespace:kube-system,Attempt:0,}" Jan 14 01:41:53.325791 systemd-networkd[1586]: calic0ed8ef7e4e: Link UP Jan 14 01:41:53.326293 systemd-networkd[1586]: calic0ed8ef7e4e: Gained carrier Jan 14 01:41:53.340493 containerd[1676]: 2026-01-14 01:41:53.264 [INFO][4547] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--5gt6s-eth0 coredns-674b8bbfcf- kube-system 7650f1f8-3708-460a-ab38-5c398404750b 825 0 2026-01-14 01:41:12 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4578-0-0-p-96753e66ce coredns-674b8bbfcf-5gt6s eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic0ed8ef7e4e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5gt6s" 
WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--5gt6s-" Jan 14 01:41:53.340493 containerd[1676]: 2026-01-14 01:41:53.264 [INFO][4547] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5gt6s" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--5gt6s-eth0" Jan 14 01:41:53.340493 containerd[1676]: 2026-01-14 01:41:53.285 [INFO][4562] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" HandleID="k8s-pod-network.728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" Workload="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--5gt6s-eth0" Jan 14 01:41:53.340683 containerd[1676]: 2026-01-14 01:41:53.285 [INFO][4562] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" HandleID="k8s-pod-network.728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" Workload="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--5gt6s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40005a6a40), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4578-0-0-p-96753e66ce", "pod":"coredns-674b8bbfcf-5gt6s", "timestamp":"2026-01-14 01:41:53.285181545 +0000 UTC"}, Hostname:"ci-4578-0-0-p-96753e66ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:41:53.340683 containerd[1676]: 2026-01-14 01:41:53.285 [INFO][4562] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:41:53.340683 containerd[1676]: 2026-01-14 01:41:53.285 [INFO][4562] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:41:53.340683 containerd[1676]: 2026-01-14 01:41:53.285 [INFO][4562] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-96753e66ce' Jan 14 01:41:53.340683 containerd[1676]: 2026-01-14 01:41:53.295 [INFO][4562] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:53.340683 containerd[1676]: 2026-01-14 01:41:53.299 [INFO][4562] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:53.340683 containerd[1676]: 2026-01-14 01:41:53.304 [INFO][4562] ipam/ipam.go 511: Trying affinity for 192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:53.340683 containerd[1676]: 2026-01-14 01:41:53.306 [INFO][4562] ipam/ipam.go 158: Attempting to load block cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:53.340683 containerd[1676]: 2026-01-14 01:41:53.308 [INFO][4562] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:53.340887 containerd[1676]: 2026-01-14 01:41:53.308 [INFO][4562] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.20.0/26 handle="k8s-pod-network.728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:53.340887 containerd[1676]: 2026-01-14 01:41:53.310 [INFO][4562] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f Jan 14 01:41:53.340887 containerd[1676]: 2026-01-14 01:41:53.314 [INFO][4562] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.20.0/26 handle="k8s-pod-network.728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:53.340887 containerd[1676]: 2026-01-14 01:41:53.320 [INFO][4562] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.20.3/26] block=192.168.20.0/26 handle="k8s-pod-network.728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:53.340887 containerd[1676]: 2026-01-14 01:41:53.320 [INFO][4562] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.20.3/26] handle="k8s-pod-network.728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:53.340887 containerd[1676]: 2026-01-14 01:41:53.321 [INFO][4562] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:41:53.340887 containerd[1676]: 2026-01-14 01:41:53.321 [INFO][4562] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.20.3/26] IPv6=[] ContainerID="728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" HandleID="k8s-pod-network.728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" Workload="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--5gt6s-eth0" Jan 14 01:41:53.341015 containerd[1676]: 2026-01-14 01:41:53.323 [INFO][4547] cni-plugin/k8s.go 418: Populated endpoint ContainerID="728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5gt6s" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--5gt6s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--5gt6s-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"7650f1f8-3708-460a-ab38-5c398404750b", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"", Pod:"coredns-674b8bbfcf-5gt6s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.20.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic0ed8ef7e4e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:53.341015 containerd[1676]: 2026-01-14 01:41:53.323 [INFO][4547] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.20.3/32] ContainerID="728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5gt6s" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--5gt6s-eth0" Jan 14 01:41:53.341015 containerd[1676]: 2026-01-14 01:41:53.323 [INFO][4547] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic0ed8ef7e4e ContainerID="728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5gt6s" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--5gt6s-eth0" Jan 14 01:41:53.341015 containerd[1676]: 2026-01-14 01:41:53.326 [INFO][4547] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5gt6s" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--5gt6s-eth0" Jan 14 01:41:53.341015 containerd[1676]: 2026-01-14 01:41:53.328 [INFO][4547] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5gt6s" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--5gt6s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--5gt6s-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"7650f1f8-3708-460a-ab38-5c398404750b", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f", Pod:"coredns-674b8bbfcf-5gt6s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.20.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic0ed8ef7e4e", 
MAC:"e2:9a:aa:5e:37:d2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:53.341015 containerd[1676]: 2026-01-14 01:41:53.337 [INFO][4547] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" Namespace="kube-system" Pod="coredns-674b8bbfcf-5gt6s" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--5gt6s-eth0" Jan 14 01:41:53.346944 systemd-networkd[1586]: vxlan.calico: Gained IPv6LL Jan 14 01:41:53.354000 audit[4580]: NETFILTER_CFG table=filter:128 family=2 entries=52 op=nft_register_chain pid=4580 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:41:53.354000 audit[4580]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=26592 a0=3 a1=ffffd7ab3de0 a2=0 a3=ffffaa1acfa8 items=0 ppid=4344 pid=4580 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.354000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:41:53.357766 kernel: kauditd_printk_skb: 262 callbacks suppressed Jan 14 01:41:53.357846 kernel: audit: type=1325 audit(1768354913.354:674): table=filter:128 family=2 entries=52 op=nft_register_chain pid=4580 subj=system_u:system_r:kernel_t:s0 
comm="iptables-nft-re" Jan 14 01:41:53.357897 kernel: audit: type=1300 audit(1768354913.354:674): arch=c00000b7 syscall=211 success=yes exit=26592 a0=3 a1=ffffd7ab3de0 a2=0 a3=ffffaa1acfa8 items=0 ppid=4344 pid=4580 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.357914 kernel: audit: type=1327 audit(1768354913.354:674): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:41:53.363963 kubelet[2905]: E0114 01:41:53.358046 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:41:53.394377 containerd[1676]: time="2026-01-14T01:41:53.394191458Z" level=info msg="connecting to shim 728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f" address="unix:///run/containerd/s/d47fa381ee48a2067b406b2ca067b2cff7de30d8c2f02874d3bb6fa98bbe18f0" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:53.393000 audit[4593]: NETFILTER_CFG table=filter:129 family=2 entries=20 op=nft_register_rule pid=4593 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:53.397777 kernel: audit: type=1325 audit(1768354913.393:675): table=filter:129 family=2 entries=20 op=nft_register_rule pid=4593 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:53.397826 kernel: audit: type=1300 
audit(1768354913.393:675): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc1df5170 a2=0 a3=1 items=0 ppid=3072 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.393000 audit[4593]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc1df5170 a2=0 a3=1 items=0 ppid=3072 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.393000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:53.403454 kernel: audit: type=1327 audit(1768354913.393:675): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:53.403513 kernel: audit: type=1325 audit(1768354913.398:676): table=nat:130 family=2 entries=14 op=nft_register_rule pid=4593 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:53.398000 audit[4593]: NETFILTER_CFG table=nat:130 family=2 entries=14 op=nft_register_rule pid=4593 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:53.398000 audit[4593]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffc1df5170 a2=0 a3=1 items=0 ppid=3072 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.409155 kernel: audit: type=1300 audit(1768354913.398:676): arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffc1df5170 a2=0 a3=1 items=0 ppid=3072 pid=4593 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.398000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:53.411065 kernel: audit: type=1327 audit(1768354913.398:676): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:53.423186 systemd[1]: Started cri-containerd-728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f.scope - libcontainer container 728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f. Jan 14 01:41:53.433000 audit: BPF prog-id=221 op=LOAD Jan 14 01:41:53.433000 audit: BPF prog-id=222 op=LOAD Jan 14 01:41:53.433000 audit[4604]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128180 a2=98 a3=0 items=0 ppid=4592 pid=4604 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.433000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732383233333135333338346339343563666561323063343065393834 Jan 14 01:41:53.434000 audit: BPF prog-id=222 op=UNLOAD Jan 14 01:41:53.434000 audit[4604]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4592 pid=4604 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.434000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732383233333135333338346339343563666561323063343065393834 Jan 14 01:41:53.434000 audit: BPF prog-id=223 op=LOAD Jan 14 01:41:53.434000 audit[4604]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001283e8 a2=98 a3=0 items=0 ppid=4592 pid=4604 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.434000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732383233333135333338346339343563666561323063343065393834 Jan 14 01:41:53.434000 audit: BPF prog-id=224 op=LOAD Jan 14 01:41:53.434000 audit[4604]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000128168 a2=98 a3=0 items=0 ppid=4592 pid=4604 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.434000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732383233333135333338346339343563666561323063343065393834 Jan 14 01:41:53.434000 audit: BPF prog-id=224 op=UNLOAD Jan 14 01:41:53.434000 audit[4604]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4592 pid=4604 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:41:53.434000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732383233333135333338346339343563666561323063343065393834 Jan 14 01:41:53.434000 audit: BPF prog-id=223 op=UNLOAD Jan 14 01:41:53.434000 audit[4604]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4592 pid=4604 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.434000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732383233333135333338346339343563666561323063343065393834 Jan 14 01:41:53.434000 audit: BPF prog-id=225 op=LOAD Jan 14 01:41:53.434000 audit[4604]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128648 a2=98 a3=0 items=0 ppid=4592 pid=4604 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.434000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3732383233333135333338346339343563666561323063343065393834 Jan 14 01:41:53.436730 kernel: audit: type=1334 audit(1768354913.433:677): prog-id=221 op=LOAD Jan 14 01:41:53.458343 containerd[1676]: time="2026-01-14T01:41:53.458283619Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-5gt6s,Uid:7650f1f8-3708-460a-ab38-5c398404750b,Namespace:kube-system,Attempt:0,} returns sandbox id \"728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f\"" Jan 14 01:41:53.464575 containerd[1676]: time="2026-01-14T01:41:53.464540915Z" level=info msg="CreateContainer within sandbox \"728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 14 01:41:53.477530 containerd[1676]: time="2026-01-14T01:41:53.476946026Z" level=info msg="Container 79a7e6a0552001445fff37c9d8948335b53343e30ac08447010d48b5976ffcf2: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:41:53.483945 containerd[1676]: time="2026-01-14T01:41:53.483838323Z" level=info msg="CreateContainer within sandbox \"728233153384c945cfea20c40e98450349feda6c8422c4e27633595e26712e1f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"79a7e6a0552001445fff37c9d8948335b53343e30ac08447010d48b5976ffcf2\"" Jan 14 01:41:53.484807 containerd[1676]: time="2026-01-14T01:41:53.484772806Z" level=info msg="StartContainer for \"79a7e6a0552001445fff37c9d8948335b53343e30ac08447010d48b5976ffcf2\"" Jan 14 01:41:53.486029 containerd[1676]: time="2026-01-14T01:41:53.485927328Z" level=info msg="connecting to shim 79a7e6a0552001445fff37c9d8948335b53343e30ac08447010d48b5976ffcf2" address="unix:///run/containerd/s/d47fa381ee48a2067b406b2ca067b2cff7de30d8c2f02874d3bb6fa98bbe18f0" protocol=ttrpc version=3 Jan 14 01:41:53.508102 systemd[1]: Started cri-containerd-79a7e6a0552001445fff37c9d8948335b53343e30ac08447010d48b5976ffcf2.scope - libcontainer container 79a7e6a0552001445fff37c9d8948335b53343e30ac08447010d48b5976ffcf2. 
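The `PROCTITLE` values in the audit records above are the process's argv, hex-encoded with NUL bytes as argument separators. A minimal decoder (plain Python, no dependencies; the sample string is the `iptables-restore` proctitle that recurs throughout this log):

```python
# Decode an audit PROCTITLE field: hex string -> NUL-separated argv list.
def decode_proctitle(hex_value: str) -> list[str]:
    raw = bytes.fromhex(hex_value)
    return raw.decode("utf-8", errors="replace").split("\x00")

argv = decode_proctitle(
    "69707461626C65732D726573746F7265002D770035002D5700"
    "313030303030002D2D6E6F666C757368002D2D636F756E74657273"
)
print(argv)
# ['iptables-restore', '-w', '5', '-W', '100000', '--noflush', '--counters']
```

The longer proctitles in the adjacent records decode the same way to `runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/<container-id>…` (the kernel truncates proctitle, so the container ID is cut off mid-string).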
Jan 14 01:41:53.516000 audit: BPF prog-id=226 op=LOAD Jan 14 01:41:53.516000 audit: BPF prog-id=227 op=LOAD Jan 14 01:41:53.516000 audit[4629]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4592 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.516000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739613765366130353532303031343435666666333763396438393438 Jan 14 01:41:53.516000 audit: BPF prog-id=227 op=UNLOAD Jan 14 01:41:53.516000 audit[4629]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4592 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.516000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739613765366130353532303031343435666666333763396438393438 Jan 14 01:41:53.516000 audit: BPF prog-id=228 op=LOAD Jan 14 01:41:53.516000 audit[4629]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4592 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.516000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739613765366130353532303031343435666666333763396438393438 Jan 14 01:41:53.516000 audit: BPF prog-id=229 op=LOAD Jan 14 01:41:53.516000 audit[4629]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4592 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.516000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739613765366130353532303031343435666666333763396438393438 Jan 14 01:41:53.516000 audit: BPF prog-id=229 op=UNLOAD Jan 14 01:41:53.516000 audit[4629]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4592 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.516000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739613765366130353532303031343435666666333763396438393438 Jan 14 01:41:53.516000 audit: BPF prog-id=228 op=UNLOAD Jan 14 01:41:53.516000 audit[4629]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4592 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
14 01:41:53.516000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739613765366130353532303031343435666666333763396438393438 Jan 14 01:41:53.516000 audit: BPF prog-id=230 op=LOAD Jan 14 01:41:53.516000 audit[4629]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4592 pid=4629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:53.516000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739613765366130353532303031343435666666333763396438393438 Jan 14 01:41:53.533094 containerd[1676]: time="2026-01-14T01:41:53.533044327Z" level=info msg="StartContainer for \"79a7e6a0552001445fff37c9d8948335b53343e30ac08447010d48b5976ffcf2\" returns successfully" Jan 14 01:41:54.229386 containerd[1676]: time="2026-01-14T01:41:54.229300553Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-675898b8d4-dphrr,Uid:adf9db04-ef07-4e4b-ac7b-0a044973dca8,Namespace:calico-apiserver,Attempt:0,}" Jan 14 01:41:54.324044 systemd-networkd[1586]: califca547949ad: Link UP Jan 14 01:41:54.324760 systemd-networkd[1586]: califca547949ad: Gained carrier Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.263 [INFO][4664] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--dphrr-eth0 calico-apiserver-675898b8d4- calico-apiserver adf9db04-ef07-4e4b-ac7b-0a044973dca8 829 0 2026-01-14 01:41:22 +0000 UTC map[apiserver:true 
app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:675898b8d4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4578-0-0-p-96753e66ce calico-apiserver-675898b8d4-dphrr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] califca547949ad [] [] }} ContainerID="4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-dphrr" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--dphrr-" Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.263 [INFO][4664] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-dphrr" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--dphrr-eth0" Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.284 [INFO][4679] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" HandleID="k8s-pod-network.4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" Workload="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--dphrr-eth0" Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.284 [INFO][4679] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" HandleID="k8s-pod-network.4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" Workload="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--dphrr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c7d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4578-0-0-p-96753e66ce", 
"pod":"calico-apiserver-675898b8d4-dphrr", "timestamp":"2026-01-14 01:41:54.284410851 +0000 UTC"}, Hostname:"ci-4578-0-0-p-96753e66ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.284 [INFO][4679] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.284 [INFO][4679] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.284 [INFO][4679] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-96753e66ce' Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.294 [INFO][4679] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.299 [INFO][4679] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.303 [INFO][4679] ipam/ipam.go 511: Trying affinity for 192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.305 [INFO][4679] ipam/ipam.go 158: Attempting to load block cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.307 [INFO][4679] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.307 [INFO][4679] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.20.0/26 handle="k8s-pod-network.4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" 
host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.309 [INFO][4679] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9 Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.314 [INFO][4679] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.20.0/26 handle="k8s-pod-network.4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.319 [INFO][4679] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.20.4/26] block=192.168.20.0/26 handle="k8s-pod-network.4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.319 [INFO][4679] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.20.4/26] handle="k8s-pod-network.4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.319 [INFO][4679] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
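The IPAM sequence above claims `192.168.20.4` out of the node's `/26` affinity block `192.168.20.0/26`. The containment and block size can be sanity-checked with the standard library alone:

```python
import ipaddress

# The affinity block and the claimed address from the IPAM log lines above.
block = ipaddress.ip_network("192.168.20.0/26")
addr = ipaddress.ip_address("192.168.20.4")

print(addr in block)        # True  - the claimed IP is inside the block
print(block.num_addresses)  # 64    - addresses per /26 block
```

Note the endpoint spec then records the address as `192.168.20.4/32`: Calico routes each workload IP as a host route, so the `/26` is only the allocation granularity, not the pod's netmask.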
Jan 14 01:41:54.338127 containerd[1676]: 2026-01-14 01:41:54.319 [INFO][4679] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.20.4/26] IPv6=[] ContainerID="4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" HandleID="k8s-pod-network.4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" Workload="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--dphrr-eth0" Jan 14 01:41:54.338832 containerd[1676]: 2026-01-14 01:41:54.321 [INFO][4664] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-dphrr" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--dphrr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--dphrr-eth0", GenerateName:"calico-apiserver-675898b8d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"adf9db04-ef07-4e4b-ac7b-0a044973dca8", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"675898b8d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"", Pod:"calico-apiserver-675898b8d4-dphrr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.20.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califca547949ad", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:54.338832 containerd[1676]: 2026-01-14 01:41:54.321 [INFO][4664] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.20.4/32] ContainerID="4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-dphrr" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--dphrr-eth0" Jan 14 01:41:54.338832 containerd[1676]: 2026-01-14 01:41:54.321 [INFO][4664] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califca547949ad ContainerID="4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-dphrr" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--dphrr-eth0" Jan 14 01:41:54.338832 containerd[1676]: 2026-01-14 01:41:54.325 [INFO][4664] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-dphrr" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--dphrr-eth0" Jan 14 01:41:54.338832 containerd[1676]: 2026-01-14 01:41:54.325 [INFO][4664] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-dphrr" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--dphrr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--dphrr-eth0", GenerateName:"calico-apiserver-675898b8d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"adf9db04-ef07-4e4b-ac7b-0a044973dca8", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"675898b8d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9", Pod:"calico-apiserver-675898b8d4-dphrr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.20.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califca547949ad", MAC:"a2:f3:ff:71:6b:47", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:54.338832 containerd[1676]: 2026-01-14 01:41:54.336 [INFO][4664] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" Namespace="calico-apiserver" Pod="calico-apiserver-675898b8d4-dphrr" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--apiserver--675898b8d4--dphrr-eth0" Jan 14 01:41:54.346000 audit[4697]: NETFILTER_CFG table=filter:131 family=2 entries=41 
op=nft_register_chain pid=4697 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:41:54.346000 audit[4697]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=23060 a0=3 a1=fffffc68e2a0 a2=0 a3=ffff8be26fa8 items=0 ppid=4344 pid=4697 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:54.346000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:41:54.361076 containerd[1676]: time="2026-01-14T01:41:54.360896643Z" level=info msg="connecting to shim 4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9" address="unix:///run/containerd/s/ce342081dc694c2a70c9a50c580e05efdaf278af3dc82e77b918936f5c384005" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:54.361654 kubelet[2905]: E0114 01:41:54.361611 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:41:54.370876 systemd-networkd[1586]: calid631348a93b: Gained IPv6LL Jan 14 01:41:54.407029 systemd[1]: Started cri-containerd-4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9.scope - libcontainer container 4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9. 
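The `arch=c00000b7` in these SYSCALL records is AUDIT_ARCH_AARCH64, so the syscall numbers follow the asm-generic table (to the best of my reading: 211 = `sendmsg`, the netlink message carrying the nft ruleset; 280 = `bpf`, runc loading its cgroup/seccomp programs; 57 = `close`). A tiny lookup sketch under that assumption:

```python
# Partial asm-generic (arm64) syscall table covering the numbers that
# appear in these audit records. Assumption: numbers taken from the
# asm-generic unistd.h mapping used by AUDIT_ARCH_AARCH64.
SYSCALLS_ARM64 = {57: "close", 211: "sendmsg", 280: "bpf"}

def name_syscall(nr: int) -> str:
    return SYSCALLS_ARM64.get(nr, f"unknown({nr})")

print(name_syscall(280))  # bpf
print(name_syscall(211))  # sendmsg
```

This matches the surrounding context: each `syscall=280` record sits next to a `BPF prog-id=… op=LOAD` event, and each `syscall=211` record follows a `NETFILTER_CFG` event from `xtables-nft-multi`.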
Jan 14 01:41:54.412000 audit[4735]: NETFILTER_CFG table=filter:132 family=2 entries=20 op=nft_register_rule pid=4735 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:54.412000 audit[4735]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffdd19bc30 a2=0 a3=1 items=0 ppid=3072 pid=4735 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:54.412000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:54.417000 audit: BPF prog-id=231 op=LOAD Jan 14 01:41:54.418000 audit: BPF prog-id=232 op=LOAD Jan 14 01:41:54.418000 audit[4716]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4706 pid=4716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:54.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462653832633337663736656365613935313835633431643038653862 Jan 14 01:41:54.418000 audit: BPF prog-id=232 op=UNLOAD Jan 14 01:41:54.418000 audit[4716]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4706 pid=4716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:54.418000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462653832633337663736656365613935313835633431643038653862 Jan 14 01:41:54.418000 audit[4735]: NETFILTER_CFG table=nat:133 family=2 entries=14 op=nft_register_rule pid=4735 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:54.418000 audit[4735]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffdd19bc30 a2=0 a3=1 items=0 ppid=3072 pid=4735 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:54.418000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:54.418000 audit: BPF prog-id=233 op=LOAD Jan 14 01:41:54.418000 audit[4716]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4706 pid=4716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:54.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462653832633337663736656365613935313835633431643038653862 Jan 14 01:41:54.418000 audit: BPF prog-id=234 op=LOAD Jan 14 01:41:54.418000 audit[4716]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=4706 pid=4716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:41:54.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462653832633337663736656365613935313835633431643038653862 Jan 14 01:41:54.418000 audit: BPF prog-id=234 op=UNLOAD Jan 14 01:41:54.418000 audit[4716]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4706 pid=4716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:54.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462653832633337663736656365613935313835633431643038653862 Jan 14 01:41:54.418000 audit: BPF prog-id=233 op=UNLOAD Jan 14 01:41:54.418000 audit[4716]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4706 pid=4716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:54.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462653832633337663736656365613935313835633431643038653862 Jan 14 01:41:54.418000 audit: BPF prog-id=235 op=LOAD Jan 14 01:41:54.418000 audit[4716]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=4706 pid=4716 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:54.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462653832633337663736656365613935313835633431643038653862 Jan 14 01:41:54.445698 containerd[1676]: time="2026-01-14T01:41:54.445643976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-675898b8d4-dphrr,Uid:adf9db04-ef07-4e4b-ac7b-0a044973dca8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4be82c37f76ecea95185c41d08e8b2112b4ea9e0a6cc4615bc2c93888de186b9\"" Jan 14 01:41:54.447704 containerd[1676]: time="2026-01-14T01:41:54.447674061Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:41:54.780087 containerd[1676]: time="2026-01-14T01:41:54.780017534Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:41:54.781613 containerd[1676]: time="2026-01-14T01:41:54.781580498Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:41:54.781693 containerd[1676]: time="2026-01-14T01:41:54.781657259Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:41:54.781855 kubelet[2905]: E0114 01:41:54.781808 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:41:54.781908 kubelet[2905]: E0114 01:41:54.781866 2905 kuberuntime_image.go:42] "Failed to 
pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:41:54.782877 kubelet[2905]: E0114 01:41:54.781992 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fjtvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-675898b8d4-dphrr_calico-apiserver(adf9db04-ef07-4e4b-ac7b-0a044973dca8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:41:54.783230 kubelet[2905]: E0114 01:41:54.783193 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:41:55.139023 systemd-networkd[1586]: calic0ed8ef7e4e: Gained IPv6LL Jan 14 01:41:55.230267 containerd[1676]: time="2026-01-14T01:41:55.230224504Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-twlzn,Uid:1b8220b4-811d-4471-95d7-cea88df93438,Namespace:calico-system,Attempt:0,}" Jan 14 01:41:55.340524 systemd-networkd[1586]: calie5e21ee2b2c: Link UP Jan 14 01:41:55.341014 systemd-networkd[1586]: calie5e21ee2b2c: Gained carrier Jan 14 01:41:55.353178 kubelet[2905]: I0114 01:41:55.353111 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-5gt6s" podStartSLOduration=43.353095332 podStartE2EDuration="43.353095332s" podCreationTimestamp="2026-01-14 01:41:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:41:54.402350787 +0000 UTC m=+49.275836862" watchObservedRunningTime="2026-01-14 01:41:55.353095332 +0000 UTC m=+50.226581407" Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.265 [INFO][4745] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--96753e66ce-k8s-csi--node--driver--twlzn-eth0 csi-node-driver- calico-system 1b8220b4-811d-4471-95d7-cea88df93438 719 0 2026-01-14 01:41:29 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4578-0-0-p-96753e66ce csi-node-driver-twlzn eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calie5e21ee2b2c [] [] }} ContainerID="0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" Namespace="calico-system" Pod="csi-node-driver-twlzn" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-csi--node--driver--twlzn-" Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.265 [INFO][4745] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" Namespace="calico-system" Pod="csi-node-driver-twlzn" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-csi--node--driver--twlzn-eth0" Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.294 [INFO][4758] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" HandleID="k8s-pod-network.0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" Workload="ci--4578--0--0--p--96753e66ce-k8s-csi--node--driver--twlzn-eth0" Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.294 [INFO][4758] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" HandleID="k8s-pod-network.0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" Workload="ci--4578--0--0--p--96753e66ce-k8s-csi--node--driver--twlzn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004ce10), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4578-0-0-p-96753e66ce", "pod":"csi-node-driver-twlzn", "timestamp":"2026-01-14 01:41:55.294053744 +0000 UTC"}, Hostname:"ci-4578-0-0-p-96753e66ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.294 [INFO][4758] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.294 [INFO][4758] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.294 [INFO][4758] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-96753e66ce' Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.304 [INFO][4758] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.308 [INFO][4758] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.313 [INFO][4758] ipam/ipam.go 511: Trying affinity for 192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.315 [INFO][4758] ipam/ipam.go 158: Attempting to load block cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.318 [INFO][4758] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.318 [INFO][4758] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.20.0/26 handle="k8s-pod-network.0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.320 [INFO][4758] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.323 [INFO][4758] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.20.0/26 handle="k8s-pod-network.0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.336 [INFO][4758] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.20.5/26] block=192.168.20.0/26 handle="k8s-pod-network.0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.336 [INFO][4758] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.20.5/26] handle="k8s-pod-network.0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.336 [INFO][4758] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:41:55.356563 containerd[1676]: 2026-01-14 01:41:55.336 [INFO][4758] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.20.5/26] IPv6=[] ContainerID="0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" HandleID="k8s-pod-network.0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" Workload="ci--4578--0--0--p--96753e66ce-k8s-csi--node--driver--twlzn-eth0" Jan 14 01:41:55.357275 containerd[1676]: 2026-01-14 01:41:55.338 [INFO][4745] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" Namespace="calico-system" Pod="csi-node-driver-twlzn" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-csi--node--driver--twlzn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-csi--node--driver--twlzn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"1b8220b4-811d-4471-95d7-cea88df93438", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"", Pod:"csi-node-driver-twlzn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.20.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie5e21ee2b2c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:55.357275 containerd[1676]: 2026-01-14 01:41:55.338 [INFO][4745] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.20.5/32] ContainerID="0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" Namespace="calico-system" Pod="csi-node-driver-twlzn" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-csi--node--driver--twlzn-eth0" Jan 14 01:41:55.357275 containerd[1676]: 2026-01-14 01:41:55.338 [INFO][4745] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie5e21ee2b2c ContainerID="0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" Namespace="calico-system" Pod="csi-node-driver-twlzn" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-csi--node--driver--twlzn-eth0" Jan 14 01:41:55.357275 containerd[1676]: 2026-01-14 01:41:55.340 [INFO][4745] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" Namespace="calico-system" Pod="csi-node-driver-twlzn" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-csi--node--driver--twlzn-eth0" Jan 14 01:41:55.357275 containerd[1676]: 2026-01-14 
01:41:55.341 [INFO][4745] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" Namespace="calico-system" Pod="csi-node-driver-twlzn" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-csi--node--driver--twlzn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-csi--node--driver--twlzn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"1b8220b4-811d-4471-95d7-cea88df93438", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba", Pod:"csi-node-driver-twlzn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.20.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie5e21ee2b2c", MAC:"6e:dc:7f:37:81:5d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:55.357275 containerd[1676]: 2026-01-14 01:41:55.353 
[INFO][4745] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" Namespace="calico-system" Pod="csi-node-driver-twlzn" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-csi--node--driver--twlzn-eth0" Jan 14 01:41:55.368508 kubelet[2905]: E0114 01:41:55.367798 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:41:55.369000 audit[4778]: NETFILTER_CFG table=filter:134 family=2 entries=44 op=nft_register_chain pid=4778 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:41:55.369000 audit[4778]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=21936 a0=3 a1=ffffd1f2bfa0 a2=0 a3=ffff95125fa8 items=0 ppid=4344 pid=4778 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:55.369000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:41:55.389489 containerd[1676]: time="2026-01-14T01:41:55.389349663Z" level=info msg="connecting to shim 0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba" address="unix:///run/containerd/s/ecbe558090f12a344ffda8ded5107f1eec1d6e0130a6527652d8bd549e7668c4" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:55.394974 systemd-networkd[1586]: califca547949ad: 
Gained IPv6LL Jan 14 01:41:55.407000 audit[4807]: NETFILTER_CFG table=filter:135 family=2 entries=20 op=nft_register_rule pid=4807 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:55.407000 audit[4807]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffd4cacef0 a2=0 a3=1 items=0 ppid=3072 pid=4807 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:55.407000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:55.412000 audit[4807]: NETFILTER_CFG table=nat:136 family=2 entries=14 op=nft_register_rule pid=4807 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:55.412000 audit[4807]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffd4cacef0 a2=0 a3=1 items=0 ppid=3072 pid=4807 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:55.412000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:55.415950 systemd[1]: Started cri-containerd-0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba.scope - libcontainer container 0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba. 
Jan 14 01:41:55.423000 audit[4820]: NETFILTER_CFG table=filter:137 family=2 entries=17 op=nft_register_rule pid=4820 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:55.423000 audit[4820]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffcc5ebf00 a2=0 a3=1 items=0 ppid=3072 pid=4820 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:55.423000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:55.425000 audit: BPF prog-id=236 op=LOAD Jan 14 01:41:55.425000 audit: BPF prog-id=237 op=LOAD Jan 14 01:41:55.425000 audit[4799]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4787 pid=4799 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:55.425000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3061663931383034616235323265323035376633376233643834346231 Jan 14 01:41:55.426000 audit: BPF prog-id=237 op=UNLOAD Jan 14 01:41:55.426000 audit[4799]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4787 pid=4799 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:55.426000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3061663931383034616235323265323035376633376233643834346231 Jan 14 01:41:55.426000 audit: BPF prog-id=238 op=LOAD Jan 14 01:41:55.426000 audit[4799]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4787 pid=4799 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:55.426000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3061663931383034616235323265323035376633376233643834346231 Jan 14 01:41:55.426000 audit: BPF prog-id=239 op=LOAD Jan 14 01:41:55.426000 audit[4799]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4787 pid=4799 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:55.426000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3061663931383034616235323265323035376633376233643834346231 Jan 14 01:41:55.426000 audit: BPF prog-id=239 op=UNLOAD Jan 14 01:41:55.426000 audit[4799]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4787 pid=4799 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 14 01:41:55.426000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3061663931383034616235323265323035376633376233643834346231 Jan 14 01:41:55.426000 audit: BPF prog-id=238 op=UNLOAD Jan 14 01:41:55.426000 audit[4799]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4787 pid=4799 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:55.426000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3061663931383034616235323265323035376633376233643834346231 Jan 14 01:41:55.426000 audit: BPF prog-id=240 op=LOAD Jan 14 01:41:55.426000 audit[4799]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4787 pid=4799 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:55.426000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3061663931383034616235323265323035376633376233643834346231 Jan 14 01:41:55.427000 audit[4820]: NETFILTER_CFG table=nat:138 family=2 entries=35 op=nft_register_chain pid=4820 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:55.427000 audit[4820]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffcc5ebf00 a2=0 a3=1 items=0 ppid=3072 
pid=4820 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:55.427000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:55.441967 containerd[1676]: time="2026-01-14T01:41:55.441920715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-twlzn,Uid:1b8220b4-811d-4471-95d7-cea88df93438,Namespace:calico-system,Attempt:0,} returns sandbox id \"0af91804ab522e2057f37b3d844b121f2f4ee6f3c8515e3d551ce6810bec82ba\"" Jan 14 01:41:55.444412 containerd[1676]: time="2026-01-14T01:41:55.444389321Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:41:55.767155 containerd[1676]: time="2026-01-14T01:41:55.767043410Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:41:55.768232 containerd[1676]: time="2026-01-14T01:41:55.768197533Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:41:55.768232 containerd[1676]: time="2026-01-14T01:41:55.768260853Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:41:55.768488 kubelet[2905]: E0114 01:41:55.768448 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:41:55.768557 kubelet[2905]: E0114 01:41:55.768499 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:41:55.768689 kubelet[2905]: E0114 01:41:55.768621 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bz2bm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMe
ssagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-twlzn_calico-system(1b8220b4-811d-4471-95d7-cea88df93438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 01:41:55.771082 containerd[1676]: time="2026-01-14T01:41:55.770961860Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:41:56.096258 containerd[1676]: time="2026-01-14T01:41:56.096117556Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:41:56.097570 containerd[1676]: time="2026-01-14T01:41:56.097529479Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:41:56.097762 containerd[1676]: time="2026-01-14T01:41:56.097553399Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:41:56.098131 kubelet[2905]: E0114 01:41:56.097899 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:41:56.098131 kubelet[2905]: E0114 01:41:56.097952 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve 
image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:41:56.098131 kubelet[2905]: E0114 01:41:56.098077 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bz2bm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:F
ile,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-twlzn_calico-system(1b8220b4-811d-4471-95d7-cea88df93438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:41:56.099496 kubelet[2905]: E0114 01:41:56.099242 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:41:56.230591 containerd[1676]: time="2026-01-14T01:41:56.230544613Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b67dd7dbc-k5lcl,Uid:8f23c871-1821-4e53-80e3-947513960a4b,Namespace:calico-system,Attempt:0,}" Jan 14 01:41:56.230797 containerd[1676]: time="2026-01-14T01:41:56.230688933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fk5c6,Uid:4d0d03be-d2cc-4c75-ae9e-07c3631ee9eb,Namespace:kube-system,Attempt:0,}" Jan 14 01:41:56.230834 containerd[1676]: time="2026-01-14T01:41:56.230806613Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-jg7bs,Uid:e6d84f7a-c9f6-41d8-94ff-304c6e803e1e,Namespace:calico-system,Attempt:0,}" Jan 14 01:41:56.364891 
systemd-networkd[1586]: cali711549e3cd5: Link UP Jan 14 01:41:56.366714 systemd-networkd[1586]: cali711549e3cd5: Gained carrier Jan 14 01:41:56.375668 kubelet[2905]: E0114 01:41:56.375470 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:41:56.375668 kubelet[2905]: E0114 01:41:56.375571 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.291 [INFO][4831] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--96753e66ce-k8s-calico--kube--controllers--b67dd7dbc--k5lcl-eth0 calico-kube-controllers-b67dd7dbc- calico-system 
8f23c871-1821-4e53-80e3-947513960a4b 824 0 2026-01-14 01:41:29 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:b67dd7dbc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4578-0-0-p-96753e66ce calico-kube-controllers-b67dd7dbc-k5lcl eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali711549e3cd5 [] [] }} ContainerID="49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" Namespace="calico-system" Pod="calico-kube-controllers-b67dd7dbc-k5lcl" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--kube--controllers--b67dd7dbc--k5lcl-" Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.292 [INFO][4831] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" Namespace="calico-system" Pod="calico-kube-controllers-b67dd7dbc-k5lcl" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--kube--controllers--b67dd7dbc--k5lcl-eth0" Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.320 [INFO][4878] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" HandleID="k8s-pod-network.49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" Workload="ci--4578--0--0--p--96753e66ce-k8s-calico--kube--controllers--b67dd7dbc--k5lcl-eth0" Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.320 [INFO][4878] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" HandleID="k8s-pod-network.49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" Workload="ci--4578--0--0--p--96753e66ce-k8s-calico--kube--controllers--b67dd7dbc--k5lcl-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004de40), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4578-0-0-p-96753e66ce", "pod":"calico-kube-controllers-b67dd7dbc-k5lcl", "timestamp":"2026-01-14 01:41:56.320586319 +0000 UTC"}, Hostname:"ci-4578-0-0-p-96753e66ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.320 [INFO][4878] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.320 [INFO][4878] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.320 [INFO][4878] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-96753e66ce' Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.331 [INFO][4878] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.335 [INFO][4878] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.341 [INFO][4878] ipam/ipam.go 511: Trying affinity for 192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.343 [INFO][4878] ipam/ipam.go 158: Attempting to load block cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.346 [INFO][4878] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.346 [INFO][4878] 
ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.20.0/26 handle="k8s-pod-network.49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.348 [INFO][4878] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3 Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.352 [INFO][4878] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.20.0/26 handle="k8s-pod-network.49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.359 [INFO][4878] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.20.6/26] block=192.168.20.0/26 handle="k8s-pod-network.49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.359 [INFO][4878] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.20.6/26] handle="k8s-pod-network.49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.359 [INFO][4878] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 01:41:56.384449 containerd[1676]: 2026-01-14 01:41:56.359 [INFO][4878] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.20.6/26] IPv6=[] ContainerID="49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" HandleID="k8s-pod-network.49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" Workload="ci--4578--0--0--p--96753e66ce-k8s-calico--kube--controllers--b67dd7dbc--k5lcl-eth0" Jan 14 01:41:56.385900 containerd[1676]: 2026-01-14 01:41:56.362 [INFO][4831] cni-plugin/k8s.go 418: Populated endpoint ContainerID="49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" Namespace="calico-system" Pod="calico-kube-controllers-b67dd7dbc-k5lcl" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--kube--controllers--b67dd7dbc--k5lcl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-calico--kube--controllers--b67dd7dbc--k5lcl-eth0", GenerateName:"calico-kube-controllers-b67dd7dbc-", Namespace:"calico-system", SelfLink:"", UID:"8f23c871-1821-4e53-80e3-947513960a4b", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"b67dd7dbc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"", Pod:"calico-kube-controllers-b67dd7dbc-k5lcl", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.20.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali711549e3cd5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:56.385900 containerd[1676]: 2026-01-14 01:41:56.362 [INFO][4831] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.20.6/32] ContainerID="49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" Namespace="calico-system" Pod="calico-kube-controllers-b67dd7dbc-k5lcl" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--kube--controllers--b67dd7dbc--k5lcl-eth0" Jan 14 01:41:56.385900 containerd[1676]: 2026-01-14 01:41:56.362 [INFO][4831] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali711549e3cd5 ContainerID="49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" Namespace="calico-system" Pod="calico-kube-controllers-b67dd7dbc-k5lcl" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--kube--controllers--b67dd7dbc--k5lcl-eth0" Jan 14 01:41:56.385900 containerd[1676]: 2026-01-14 01:41:56.366 [INFO][4831] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" Namespace="calico-system" Pod="calico-kube-controllers-b67dd7dbc-k5lcl" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--kube--controllers--b67dd7dbc--k5lcl-eth0" Jan 14 01:41:56.385900 containerd[1676]: 2026-01-14 01:41:56.367 [INFO][4831] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" Namespace="calico-system" Pod="calico-kube-controllers-b67dd7dbc-k5lcl" 
WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--kube--controllers--b67dd7dbc--k5lcl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-calico--kube--controllers--b67dd7dbc--k5lcl-eth0", GenerateName:"calico-kube-controllers-b67dd7dbc-", Namespace:"calico-system", SelfLink:"", UID:"8f23c871-1821-4e53-80e3-947513960a4b", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"b67dd7dbc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3", Pod:"calico-kube-controllers-b67dd7dbc-k5lcl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.20.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali711549e3cd5", MAC:"6e:90:68:3c:66:63", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:56.385900 containerd[1676]: 2026-01-14 01:41:56.382 [INFO][4831] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" Namespace="calico-system" 
Pod="calico-kube-controllers-b67dd7dbc-k5lcl" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-calico--kube--controllers--b67dd7dbc--k5lcl-eth0" Jan 14 01:41:56.404000 audit[4913]: NETFILTER_CFG table=filter:139 family=2 entries=48 op=nft_register_chain pid=4913 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:41:56.404000 audit[4913]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=23124 a0=3 a1=ffffdb4b5190 a2=0 a3=ffff7f782fa8 items=0 ppid=4344 pid=4913 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.404000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:41:56.417658 containerd[1676]: time="2026-01-14T01:41:56.417617522Z" level=info msg="connecting to shim 49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3" address="unix:///run/containerd/s/92fd78a69d8bd3fc7f554b75f3bcedf34f895149d1d9d3bceb6cce2ba705746a" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:56.441949 systemd[1]: Started cri-containerd-49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3.scope - libcontainer container 49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3. 
Jan 14 01:41:56.454000 audit: BPF prog-id=241 op=LOAD Jan 14 01:41:56.454000 audit: BPF prog-id=242 op=LOAD Jan 14 01:41:56.454000 audit[4933]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4922 pid=4933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.454000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3439323439633564326265663138613466373664643862616264323834 Jan 14 01:41:56.454000 audit: BPF prog-id=242 op=UNLOAD Jan 14 01:41:56.454000 audit[4933]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4922 pid=4933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.454000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3439323439633564326265663138613466373664643862616264323834 Jan 14 01:41:56.454000 audit: BPF prog-id=243 op=LOAD Jan 14 01:41:56.454000 audit[4933]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4922 pid=4933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.454000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3439323439633564326265663138613466373664643862616264323834 Jan 14 01:41:56.454000 audit: BPF prog-id=244 op=LOAD Jan 14 01:41:56.454000 audit[4933]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4922 pid=4933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.454000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3439323439633564326265663138613466373664643862616264323834 Jan 14 01:41:56.454000 audit: BPF prog-id=244 op=UNLOAD Jan 14 01:41:56.454000 audit[4933]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4922 pid=4933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.454000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3439323439633564326265663138613466373664643862616264323834 Jan 14 01:41:56.454000 audit: BPF prog-id=243 op=UNLOAD Jan 14 01:41:56.454000 audit[4933]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4922 pid=4933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
14 01:41:56.454000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3439323439633564326265663138613466373664643862616264323834 Jan 14 01:41:56.454000 audit: BPF prog-id=245 op=LOAD Jan 14 01:41:56.454000 audit[4933]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4922 pid=4933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.454000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3439323439633564326265663138613466373664643862616264323834 Jan 14 01:41:56.473684 systemd-networkd[1586]: cali1337fcf62fd: Link UP Jan 14 01:41:56.474799 systemd-networkd[1586]: cali1337fcf62fd: Gained carrier Jan 14 01:41:56.490761 containerd[1676]: time="2026-01-14T01:41:56.490669705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b67dd7dbc-k5lcl,Uid:8f23c871-1821-4e53-80e3-947513960a4b,Namespace:calico-system,Attempt:0,} returns sandbox id \"49249c5d2bef18a4f76dd8babd2847d70c2549494c93339a7f3d701fc7a34de3\"" Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.291 [INFO][4841] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--fk5c6-eth0 coredns-674b8bbfcf- kube-system 4d0d03be-d2cc-4c75-ae9e-07c3631ee9eb 826 0 2026-01-14 01:41:12 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4578-0-0-p-96753e66ce coredns-674b8bbfcf-fk5c6 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali1337fcf62fd [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" Namespace="kube-system" Pod="coredns-674b8bbfcf-fk5c6" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--fk5c6-" Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.291 [INFO][4841] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" Namespace="kube-system" Pod="coredns-674b8bbfcf-fk5c6" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--fk5c6-eth0" Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.324 [INFO][4876] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" HandleID="k8s-pod-network.c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" Workload="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--fk5c6-eth0" Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.324 [INFO][4876] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" HandleID="k8s-pod-network.c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" Workload="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--fk5c6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400058caa0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4578-0-0-p-96753e66ce", "pod":"coredns-674b8bbfcf-fk5c6", "timestamp":"2026-01-14 01:41:56.324259768 +0000 UTC"}, Hostname:"ci-4578-0-0-p-96753e66ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.324 [INFO][4876] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.359 [INFO][4876] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.359 [INFO][4876] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-96753e66ce' Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.432 [INFO][4876] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.439 [INFO][4876] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.445 [INFO][4876] ipam/ipam.go 511: Trying affinity for 192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.448 [INFO][4876] ipam/ipam.go 158: Attempting to load block cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.452 [INFO][4876] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.452 [INFO][4876] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.20.0/26 handle="k8s-pod-network.c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.454 [INFO][4876] ipam/ipam.go 1780: Creating new handle: 
k8s-pod-network.c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.460 [INFO][4876] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.20.0/26 handle="k8s-pod-network.c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.469 [INFO][4876] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.20.7/26] block=192.168.20.0/26 handle="k8s-pod-network.c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.469 [INFO][4876] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.20.7/26] handle="k8s-pod-network.c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.469 [INFO][4876] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 01:41:56.492254 containerd[1676]: 2026-01-14 01:41:56.469 [INFO][4876] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.20.7/26] IPv6=[] ContainerID="c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" HandleID="k8s-pod-network.c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" Workload="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--fk5c6-eth0" Jan 14 01:41:56.492908 containerd[1676]: 2026-01-14 01:41:56.471 [INFO][4841] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" Namespace="kube-system" Pod="coredns-674b8bbfcf-fk5c6" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--fk5c6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--fk5c6-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4d0d03be-d2cc-4c75-ae9e-07c3631ee9eb", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"", Pod:"coredns-674b8bbfcf-fk5c6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.20.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali1337fcf62fd", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:56.492908 containerd[1676]: 2026-01-14 01:41:56.471 [INFO][4841] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.20.7/32] ContainerID="c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" Namespace="kube-system" Pod="coredns-674b8bbfcf-fk5c6" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--fk5c6-eth0" Jan 14 01:41:56.492908 containerd[1676]: 2026-01-14 01:41:56.471 [INFO][4841] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1337fcf62fd ContainerID="c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" Namespace="kube-system" Pod="coredns-674b8bbfcf-fk5c6" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--fk5c6-eth0" Jan 14 01:41:56.492908 containerd[1676]: 2026-01-14 01:41:56.475 [INFO][4841] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" Namespace="kube-system" Pod="coredns-674b8bbfcf-fk5c6" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--fk5c6-eth0" Jan 14 01:41:56.492908 containerd[1676]: 2026-01-14 01:41:56.475 [INFO][4841] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" Namespace="kube-system" Pod="coredns-674b8bbfcf-fk5c6" 
WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--fk5c6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--fk5c6-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4d0d03be-d2cc-4c75-ae9e-07c3631ee9eb", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c", Pod:"coredns-674b8bbfcf-fk5c6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.20.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1337fcf62fd", MAC:"7a:0f:49:2c:fb:c0", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:56.492908 
containerd[1676]: 2026-01-14 01:41:56.490 [INFO][4841] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" Namespace="kube-system" Pod="coredns-674b8bbfcf-fk5c6" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-coredns--674b8bbfcf--fk5c6-eth0" Jan 14 01:41:56.495532 containerd[1676]: time="2026-01-14T01:41:56.495471757Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:41:56.509000 audit[4968]: NETFILTER_CFG table=filter:140 family=2 entries=48 op=nft_register_chain pid=4968 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:41:56.509000 audit[4968]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=22704 a0=3 a1=ffffec430cc0 a2=0 a3=ffff9b4bbfa8 items=0 ppid=4344 pid=4968 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.509000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:41:56.524195 containerd[1676]: time="2026-01-14T01:41:56.524145229Z" level=info msg="connecting to shim c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c" address="unix:///run/containerd/s/916389b698c761f42ae4b0c7e3bf653c30ce54cb5fbfc782f0cf1fb9c5ad9eb4" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:56.547974 systemd[1]: Started cri-containerd-c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c.scope - libcontainer container c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c. 
Jan 14 01:41:56.561000 audit: BPF prog-id=246 op=LOAD Jan 14 01:41:56.561000 audit: BPF prog-id=247 op=LOAD Jan 14 01:41:56.561000 audit[4989]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128180 a2=98 a3=0 items=0 ppid=4978 pid=4989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.561000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332396533666363663864613664636631386466326634316631626165 Jan 14 01:41:56.562000 audit: BPF prog-id=247 op=UNLOAD Jan 14 01:41:56.562000 audit[4989]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4978 pid=4989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.562000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332396533666363663864613664636631386466326634316631626165 Jan 14 01:41:56.562000 audit: BPF prog-id=248 op=LOAD Jan 14 01:41:56.562000 audit[4989]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001283e8 a2=98 a3=0 items=0 ppid=4978 pid=4989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.562000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332396533666363663864613664636631386466326634316631626165 Jan 14 01:41:56.562000 audit: BPF prog-id=249 op=LOAD Jan 14 01:41:56.562000 audit[4989]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000128168 a2=98 a3=0 items=0 ppid=4978 pid=4989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.562000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332396533666363663864613664636631386466326634316631626165 Jan 14 01:41:56.562000 audit: BPF prog-id=249 op=UNLOAD Jan 14 01:41:56.562000 audit[4989]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4978 pid=4989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.562000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332396533666363663864613664636631386466326634316631626165 Jan 14 01:41:56.562000 audit: BPF prog-id=248 op=UNLOAD Jan 14 01:41:56.562000 audit[4989]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4978 pid=4989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
14 01:41:56.562000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332396533666363663864613664636631386466326634316631626165 Jan 14 01:41:56.562000 audit: BPF prog-id=250 op=LOAD Jan 14 01:41:56.562000 audit[4989]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128648 a2=98 a3=0 items=0 ppid=4978 pid=4989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.562000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6332396533666363663864613664636631386466326634316631626165 Jan 14 01:41:56.571760 systemd-networkd[1586]: cali16f1a8fcc2d: Link UP Jan 14 01:41:56.572781 systemd-networkd[1586]: cali16f1a8fcc2d: Gained carrier Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.296 [INFO][4843] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4578--0--0--p--96753e66ce-k8s-goldmane--666569f655--jg7bs-eth0 goldmane-666569f655- calico-system e6d84f7a-c9f6-41d8-94ff-304c6e803e1e 828 0 2026-01-14 01:41:26 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4578-0-0-p-96753e66ce goldmane-666569f655-jg7bs eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali16f1a8fcc2d [] [] }} ContainerID="9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" Namespace="calico-system" Pod="goldmane-666569f655-jg7bs" 
WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-goldmane--666569f655--jg7bs-" Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.296 [INFO][4843] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" Namespace="calico-system" Pod="goldmane-666569f655-jg7bs" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-goldmane--666569f655--jg7bs-eth0" Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.326 [INFO][4889] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" HandleID="k8s-pod-network.9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" Workload="ci--4578--0--0--p--96753e66ce-k8s-goldmane--666569f655--jg7bs-eth0" Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.326 [INFO][4889] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" HandleID="k8s-pod-network.9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" Workload="ci--4578--0--0--p--96753e66ce-k8s-goldmane--666569f655--jg7bs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400031f4b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4578-0-0-p-96753e66ce", "pod":"goldmane-666569f655-jg7bs", "timestamp":"2026-01-14 01:41:56.326448333 +0000 UTC"}, Hostname:"ci-4578-0-0-p-96753e66ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.326 [INFO][4889] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.469 [INFO][4889] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.469 [INFO][4889] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4578-0-0-p-96753e66ce' Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.532 [INFO][4889] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.539 [INFO][4889] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.545 [INFO][4889] ipam/ipam.go 511: Trying affinity for 192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.547 [INFO][4889] ipam/ipam.go 158: Attempting to load block cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.550 [INFO][4889] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.20.0/26 host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.551 [INFO][4889] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.20.0/26 handle="k8s-pod-network.9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.553 [INFO][4889] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3 Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.557 [INFO][4889] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.20.0/26 handle="k8s-pod-network.9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.566 [INFO][4889] ipam/ipam.go 1262: Successfully 
claimed IPs: [192.168.20.8/26] block=192.168.20.0/26 handle="k8s-pod-network.9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.566 [INFO][4889] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.20.8/26] handle="k8s-pod-network.9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" host="ci-4578-0-0-p-96753e66ce" Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.566 [INFO][4889] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 01:41:56.595109 containerd[1676]: 2026-01-14 01:41:56.566 [INFO][4889] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.20.8/26] IPv6=[] ContainerID="9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" HandleID="k8s-pod-network.9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" Workload="ci--4578--0--0--p--96753e66ce-k8s-goldmane--666569f655--jg7bs-eth0" Jan 14 01:41:56.595713 containerd[1676]: 2026-01-14 01:41:56.568 [INFO][4843] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" Namespace="calico-system" Pod="goldmane-666569f655-jg7bs" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-goldmane--666569f655--jg7bs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-goldmane--666569f655--jg7bs-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"e6d84f7a-c9f6-41d8-94ff-304c6e803e1e", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"", Pod:"goldmane-666569f655-jg7bs", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.20.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali16f1a8fcc2d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:56.595713 containerd[1676]: 2026-01-14 01:41:56.569 [INFO][4843] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.20.8/32] ContainerID="9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" Namespace="calico-system" Pod="goldmane-666569f655-jg7bs" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-goldmane--666569f655--jg7bs-eth0" Jan 14 01:41:56.595713 containerd[1676]: 2026-01-14 01:41:56.569 [INFO][4843] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali16f1a8fcc2d ContainerID="9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" Namespace="calico-system" Pod="goldmane-666569f655-jg7bs" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-goldmane--666569f655--jg7bs-eth0" Jan 14 01:41:56.595713 containerd[1676]: 2026-01-14 01:41:56.573 [INFO][4843] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" Namespace="calico-system" Pod="goldmane-666569f655-jg7bs" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-goldmane--666569f655--jg7bs-eth0" Jan 14 01:41:56.595713 containerd[1676]: 2026-01-14 01:41:56.574 [INFO][4843] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" Namespace="calico-system" Pod="goldmane-666569f655-jg7bs" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-goldmane--666569f655--jg7bs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4578--0--0--p--96753e66ce-k8s-goldmane--666569f655--jg7bs-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"e6d84f7a-c9f6-41d8-94ff-304c6e803e1e", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 1, 41, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4578-0-0-p-96753e66ce", ContainerID:"9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3", Pod:"goldmane-666569f655-jg7bs", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.20.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali16f1a8fcc2d", MAC:"ee:61:75:d2:f8:36", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 01:41:56.595713 containerd[1676]: 2026-01-14 01:41:56.589 [INFO][4843] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" Namespace="calico-system" Pod="goldmane-666569f655-jg7bs" WorkloadEndpoint="ci--4578--0--0--p--96753e66ce-k8s-goldmane--666569f655--jg7bs-eth0" Jan 14 01:41:56.605455 containerd[1676]: time="2026-01-14T01:41:56.605406633Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-fk5c6,Uid:4d0d03be-d2cc-4c75-ae9e-07c3631ee9eb,Namespace:kube-system,Attempt:0,} returns sandbox id \"c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c\"" Jan 14 01:41:56.611492 containerd[1676]: time="2026-01-14T01:41:56.611446648Z" level=info msg="CreateContainer within sandbox \"c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 14 01:41:56.615000 audit[5026]: NETFILTER_CFG table=filter:141 family=2 entries=70 op=nft_register_chain pid=5026 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 01:41:56.615000 audit[5026]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=33956 a0=3 a1=ffffe42e80f0 a2=0 a3=ffffaf3d0fa8 items=0 ppid=4344 pid=5026 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.615000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 01:41:56.629927 containerd[1676]: time="2026-01-14T01:41:56.629890095Z" level=info msg="Container 8bac4c13eee05205d8be72d744a4c6ec385a2831f0945a528e1ba1a5af461933: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:41:56.634922 containerd[1676]: time="2026-01-14T01:41:56.634800147Z" level=info msg="connecting to shim 9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3" 
address="unix:///run/containerd/s/d93c058db123ff820e079d5c983ea154dce31b214e0cebd59fa47e0b1fd422c5" namespace=k8s.io protocol=ttrpc version=3 Jan 14 01:41:56.636925 containerd[1676]: time="2026-01-14T01:41:56.636891872Z" level=info msg="CreateContainer within sandbox \"c29e3fccf8da6dcf18df2f41f1bae2ba08a595ba8d42215a72f9da4c925d946c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8bac4c13eee05205d8be72d744a4c6ec385a2831f0945a528e1ba1a5af461933\"" Jan 14 01:41:56.638628 containerd[1676]: time="2026-01-14T01:41:56.638589596Z" level=info msg="StartContainer for \"8bac4c13eee05205d8be72d744a4c6ec385a2831f0945a528e1ba1a5af461933\"" Jan 14 01:41:56.639687 containerd[1676]: time="2026-01-14T01:41:56.639642959Z" level=info msg="connecting to shim 8bac4c13eee05205d8be72d744a4c6ec385a2831f0945a528e1ba1a5af461933" address="unix:///run/containerd/s/916389b698c761f42ae4b0c7e3bf653c30ce54cb5fbfc782f0cf1fb9c5ad9eb4" protocol=ttrpc version=3 Jan 14 01:41:56.662955 systemd[1]: Started cri-containerd-9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3.scope - libcontainer container 9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3. Jan 14 01:41:56.666227 systemd[1]: Started cri-containerd-8bac4c13eee05205d8be72d744a4c6ec385a2831f0945a528e1ba1a5af461933.scope - libcontainer container 8bac4c13eee05205d8be72d744a4c6ec385a2831f0945a528e1ba1a5af461933. 
Jan 14 01:41:56.674000 audit: BPF prog-id=251 op=LOAD Jan 14 01:41:56.674000 audit: BPF prog-id=252 op=LOAD Jan 14 01:41:56.674000 audit[5048]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=5035 pid=5048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.674000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931353764613635363865386538333964653032353632643639663037 Jan 14 01:41:56.674000 audit: BPF prog-id=252 op=UNLOAD Jan 14 01:41:56.674000 audit[5048]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5035 pid=5048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.674000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931353764613635363865386538333964653032353632643639663037 Jan 14 01:41:56.674000 audit: BPF prog-id=253 op=LOAD Jan 14 01:41:56.674000 audit[5048]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=5035 pid=5048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.674000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931353764613635363865386538333964653032353632643639663037 Jan 14 01:41:56.675000 audit: BPF prog-id=254 op=LOAD Jan 14 01:41:56.675000 audit[5048]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=5035 pid=5048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.675000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931353764613635363865386538333964653032353632643639663037 Jan 14 01:41:56.675000 audit: BPF prog-id=254 op=UNLOAD Jan 14 01:41:56.675000 audit[5048]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5035 pid=5048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.675000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931353764613635363865386538333964653032353632643639663037 Jan 14 01:41:56.675000 audit: BPF prog-id=253 op=UNLOAD Jan 14 01:41:56.675000 audit[5048]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5035 pid=5048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
14 01:41:56.675000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931353764613635363865386538333964653032353632643639663037 Jan 14 01:41:56.675000 audit: BPF prog-id=255 op=LOAD Jan 14 01:41:56.675000 audit[5048]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=5035 pid=5048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.675000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3931353764613635363865386538333964653032353632643639663037 Jan 14 01:41:56.676000 audit: BPF prog-id=256 op=LOAD Jan 14 01:41:56.677000 audit: BPF prog-id=257 op=LOAD Jan 14 01:41:56.677000 audit[5046]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4978 pid=5046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.677000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862616334633133656565303532303564386265373264373434613463 Jan 14 01:41:56.677000 audit: BPF prog-id=257 op=UNLOAD Jan 14 01:41:56.677000 audit[5046]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4978 pid=5046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.677000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862616334633133656565303532303564386265373264373434613463 Jan 14 01:41:56.677000 audit: BPF prog-id=258 op=LOAD Jan 14 01:41:56.677000 audit[5046]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4978 pid=5046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.677000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862616334633133656565303532303564386265373264373434613463 Jan 14 01:41:56.678000 audit: BPF prog-id=259 op=LOAD Jan 14 01:41:56.678000 audit[5046]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4978 pid=5046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.678000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862616334633133656565303532303564386265373264373434613463 Jan 14 01:41:56.678000 audit: BPF prog-id=259 op=UNLOAD Jan 14 01:41:56.678000 audit[5046]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4978 pid=5046 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.678000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862616334633133656565303532303564386265373264373434613463 Jan 14 01:41:56.678000 audit: BPF prog-id=258 op=UNLOAD Jan 14 01:41:56.678000 audit[5046]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4978 pid=5046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.678000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862616334633133656565303532303564386265373264373434613463 Jan 14 01:41:56.678000 audit: BPF prog-id=260 op=LOAD Jan 14 01:41:56.678000 audit[5046]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4978 pid=5046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:56.678000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3862616334633133656565303532303564386265373264373434613463 Jan 14 01:41:56.706433 containerd[1676]: time="2026-01-14T01:41:56.706385726Z" level=info msg="StartContainer for \"8bac4c13eee05205d8be72d744a4c6ec385a2831f0945a528e1ba1a5af461933\" returns 
successfully" Jan 14 01:41:56.714437 containerd[1676]: time="2026-01-14T01:41:56.714390306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-jg7bs,Uid:e6d84f7a-c9f6-41d8-94ff-304c6e803e1e,Namespace:calico-system,Attempt:0,} returns sandbox id \"9157da6568e8e839de02562d69f075a7582488547445887619cba5857bdc64b3\"" Jan 14 01:41:56.731587 kubelet[2905]: I0114 01:41:56.731515 2905 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 14 01:41:56.829937 containerd[1676]: time="2026-01-14T01:41:56.829843756Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:41:56.836846 containerd[1676]: time="2026-01-14T01:41:56.836709933Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:41:56.836846 containerd[1676]: time="2026-01-14T01:41:56.836796213Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:41:56.837195 kubelet[2905]: E0114 01:41:56.837147 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:41:56.837252 kubelet[2905]: E0114 01:41:56.837196 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:41:56.837451 
kubelet[2905]: E0114 01:41:56.837389 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pf62l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-b67dd7dbc-k5lcl_calico-system(8f23c871-1821-4e53-80e3-947513960a4b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:41:56.837918 containerd[1676]: time="2026-01-14T01:41:56.837886376Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:41:56.839134 kubelet[2905]: E0114 01:41:56.839090 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:41:57.158505 containerd[1676]: time="2026-01-14T01:41:57.158192380Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 
01:41:57.164238 containerd[1676]: time="2026-01-14T01:41:57.164127555Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:41:57.164537 containerd[1676]: time="2026-01-14T01:41:57.164476915Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:41:57.164788 kubelet[2905]: E0114 01:41:57.164741 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:41:57.164857 kubelet[2905]: E0114 01:41:57.164795 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:41:57.164980 kubelet[2905]: E0114 01:41:57.164929 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tk8sx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-jg7bs_calico-system(e6d84f7a-c9f6-41d8-94ff-304c6e803e1e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:41:57.167172 kubelet[2905]: E0114 01:41:57.167006 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:41:57.186977 systemd-networkd[1586]: calie5e21ee2b2c: Gained IPv6LL Jan 14 01:41:57.377394 kubelet[2905]: E0114 01:41:57.377147 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:41:57.378943 kubelet[2905]: E0114 01:41:57.378548 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:41:57.385888 kubelet[2905]: E0114 01:41:57.385844 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:41:57.415482 kubelet[2905]: I0114 01:41:57.415346 2905 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/coredns-674b8bbfcf-fk5c6" podStartSLOduration=45.415330985 podStartE2EDuration="45.415330985s" podCreationTimestamp="2026-01-14 01:41:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 01:41:57.414965544 +0000 UTC m=+52.288451659" watchObservedRunningTime="2026-01-14 01:41:57.415330985 +0000 UTC m=+52.288817060" Jan 14 01:41:57.432000 audit[5165]: NETFILTER_CFG table=filter:142 family=2 entries=14 op=nft_register_rule pid=5165 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:57.432000 audit[5165]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff2c160e0 a2=0 a3=1 items=0 ppid=3072 pid=5165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:57.432000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:57.454000 audit[5165]: NETFILTER_CFG table=nat:143 family=2 entries=56 op=nft_register_chain pid=5165 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:57.454000 audit[5165]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=fffff2c160e0 a2=0 a3=1 items=0 ppid=3072 pid=5165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:57.454000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:57.478000 audit[5168]: NETFILTER_CFG table=filter:144 family=2 entries=14 op=nft_register_rule pid=5168 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:57.478000 
audit[5168]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff6753d70 a2=0 a3=1 items=0 ppid=3072 pid=5168 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:57.478000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:57.486000 audit[5168]: NETFILTER_CFG table=nat:145 family=2 entries=20 op=nft_register_rule pid=5168 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:41:57.486000 audit[5168]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=fffff6753d70 a2=0 a3=1 items=0 ppid=3072 pid=5168 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:41:57.486000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:41:57.507954 systemd-networkd[1586]: cali711549e3cd5: Gained IPv6LL Jan 14 01:41:57.955907 systemd-networkd[1586]: cali16f1a8fcc2d: Gained IPv6LL Jan 14 01:41:57.956731 systemd-networkd[1586]: cali1337fcf62fd: Gained IPv6LL Jan 14 01:41:58.385220 kubelet[2905]: E0114 01:41:58.385160 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" 
podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:41:58.385714 kubelet[2905]: E0114 01:41:58.385595 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:42:03.231790 containerd[1676]: time="2026-01-14T01:42:03.230784372Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:42:03.571021 containerd[1676]: time="2026-01-14T01:42:03.570958985Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:42:03.572904 containerd[1676]: time="2026-01-14T01:42:03.572836150Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:42:03.572980 containerd[1676]: time="2026-01-14T01:42:03.572910830Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:03.573119 kubelet[2905]: E0114 01:42:03.573049 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:42:03.573119 kubelet[2905]: E0114 01:42:03.573115 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:42:03.573572 kubelet[2905]: E0114 01:42:03.573232 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1bdf15913aa7461abab7765f5b689915,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gtqjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-77b7df9c9-vcm8x_calico-system(afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:03.575390 containerd[1676]: time="2026-01-14T01:42:03.575292396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:42:04.064943 containerd[1676]: time="2026-01-14T01:42:04.064867264Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:42:04.069747 containerd[1676]: time="2026-01-14T01:42:04.069114314Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:42:04.069747 containerd[1676]: time="2026-01-14T01:42:04.069217515Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:04.069878 kubelet[2905]: E0114 01:42:04.069393 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:42:04.069878 kubelet[2905]: E0114 01:42:04.069440 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:42:04.069878 kubelet[2905]: E0114 01:42:04.069547 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gtqjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-77b7df9c9-vcm8x_calico-system(afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:04.071145 kubelet[2905]: E0114 01:42:04.071086 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11" Jan 14 01:42:09.235712 containerd[1676]: time="2026-01-14T01:42:09.233880469Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:42:09.567748 containerd[1676]: time="2026-01-14T01:42:09.567676066Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:42:09.569232 containerd[1676]: time="2026-01-14T01:42:09.569172510Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:42:09.569313 containerd[1676]: time="2026-01-14T01:42:09.569245190Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:09.569443 kubelet[2905]: E0114 01:42:09.569406 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:42:09.569822 kubelet[2905]: E0114 01:42:09.569456 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:42:09.569822 kubelet[2905]: E0114 01:42:09.569644 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9blv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-675898b8d4-fz6xg_calico-apiserver(ac329ad1-edb3-4891-9a01-4d5e568d082e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:09.570012 containerd[1676]: time="2026-01-14T01:42:09.569951192Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:42:09.571234 kubelet[2905]: E0114 01:42:09.571198 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:42:09.905024 containerd[1676]: time="2026-01-14T01:42:09.904635552Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 
01:42:09.906083 containerd[1676]: time="2026-01-14T01:42:09.906047355Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:42:09.906161 containerd[1676]: time="2026-01-14T01:42:09.906077195Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:09.906327 kubelet[2905]: E0114 01:42:09.906283 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:42:09.906397 kubelet[2905]: E0114 01:42:09.906338 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:42:09.906510 kubelet[2905]: E0114 01:42:09.906470 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fjtvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-675898b8d4-dphrr_calico-apiserver(adf9db04-ef07-4e4b-ac7b-0a044973dca8): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:09.907669 kubelet[2905]: E0114 01:42:09.907635 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:42:10.230814 containerd[1676]: time="2026-01-14T01:42:10.230676970Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:42:10.530346 containerd[1676]: time="2026-01-14T01:42:10.530299201Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:42:10.531539 containerd[1676]: time="2026-01-14T01:42:10.531503284Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:42:10.533880 kubelet[2905]: E0114 01:42:10.531691 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:42:10.533880 kubelet[2905]: E0114 01:42:10.531759 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:42:10.533880 kubelet[2905]: E0114 01:42:10.532008 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tk8sx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-jg7bs_calico-system(e6d84f7a-c9f6-41d8-94ff-304c6e803e1e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:10.534107 containerd[1676]: time="2026-01-14T01:42:10.531579524Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:10.534107 containerd[1676]: time="2026-01-14T01:42:10.532062165Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:42:10.534163 kubelet[2905]: E0114 01:42:10.534106 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:42:10.878423 containerd[1676]: 
time="2026-01-14T01:42:10.878228354Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:42:10.880029 containerd[1676]: time="2026-01-14T01:42:10.879951958Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:42:10.880126 containerd[1676]: time="2026-01-14T01:42:10.880031158Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:10.880247 kubelet[2905]: E0114 01:42:10.880202 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:42:10.880523 kubelet[2905]: E0114 01:42:10.880264 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:42:10.882855 kubelet[2905]: E0114 01:42:10.882780 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bz2bm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-twlzn_calico-system(1b8220b4-811d-4471-95d7-cea88df93438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 14 01:42:10.884970 containerd[1676]: time="2026-01-14T01:42:10.884923611Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:42:11.239569 containerd[1676]: time="2026-01-14T01:42:11.238960539Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:42:11.240889 containerd[1676]: time="2026-01-14T01:42:11.240826983Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:42:11.240977 containerd[1676]: time="2026-01-14T01:42:11.240927344Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:11.241124 kubelet[2905]: E0114 01:42:11.241083 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:42:11.241185 kubelet[2905]: E0114 01:42:11.241166 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:42:11.241782 kubelet[2905]: E0114 01:42:11.241296 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bz2bm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-twlzn_calico-system(1b8220b4-811d-4471-95d7-cea88df93438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:11.243146 kubelet[2905]: E0114 01:42:11.242893 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:42:12.231754 containerd[1676]: time="2026-01-14T01:42:12.230105225Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:42:12.558107 containerd[1676]: time="2026-01-14T01:42:12.557934807Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:42:12.559498 containerd[1676]: time="2026-01-14T01:42:12.559367091Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:42:12.559498 containerd[1676]: time="2026-01-14T01:42:12.559442451Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:12.559803 kubelet[2905]: E0114 01:42:12.559766 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve 
image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:42:12.560373 kubelet[2905]: E0114 01:42:12.560131 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:42:12.560373 kubelet[2905]: E0114 01:42:12.560309 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pf62l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-b67dd7dbc-k5lcl_calico-system(8f23c871-1821-4e53-80e3-947513960a4b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:12.561694 kubelet[2905]: E0114 01:42:12.561661 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:42:16.235519 kubelet[2905]: 
E0114 01:42:16.235381 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11" Jan 14 01:42:21.230318 kubelet[2905]: E0114 01:42:21.229988 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:42:22.230319 kubelet[2905]: E0114 01:42:22.230272 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:42:23.235097 kubelet[2905]: E0114 01:42:23.235025 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:42:26.230221 kubelet[2905]: E0114 01:42:26.230143 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:42:27.230536 kubelet[2905]: E0114 01:42:27.230471 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:42:31.231451 containerd[1676]: time="2026-01-14T01:42:31.231363206Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:42:31.567896 containerd[1676]: time="2026-01-14T01:42:31.567851090Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:42:31.569430 containerd[1676]: time="2026-01-14T01:42:31.569390214Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:42:31.569538 containerd[1676]: time="2026-01-14T01:42:31.569468094Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:31.569636 kubelet[2905]: E0114 01:42:31.569601 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:42:31.569928 kubelet[2905]: E0114 01:42:31.569648 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:42:31.569928 kubelet[2905]: E0114 01:42:31.569784 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1bdf15913aa7461abab7765f5b689915,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gtqjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-77b7df9c9-vcm8x_calico-system(afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:31.572315 containerd[1676]: time="2026-01-14T01:42:31.572265221Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:42:31.909935 containerd[1676]: 
time="2026-01-14T01:42:31.908913785Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:42:31.917045 containerd[1676]: time="2026-01-14T01:42:31.916983806Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:42:31.917174 containerd[1676]: time="2026-01-14T01:42:31.917089006Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:31.917414 kubelet[2905]: E0114 01:42:31.917272 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:42:31.917414 kubelet[2905]: E0114 01:42:31.917326 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:42:31.917881 kubelet[2905]: E0114 01:42:31.917800 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gtqjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-77b7df9c9-vcm8x_calico-system(afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:31.919151 kubelet[2905]: E0114 01:42:31.918987 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11" Jan 14 01:42:36.230480 containerd[1676]: time="2026-01-14T01:42:36.230393385Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:42:36.564288 containerd[1676]: time="2026-01-14T01:42:36.564231903Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:42:36.565831 containerd[1676]: time="2026-01-14T01:42:36.565776186Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:42:36.565918 containerd[1676]: time="2026-01-14T01:42:36.565825947Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:36.566152 kubelet[2905]: E0114 01:42:36.566092 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:42:36.566152 kubelet[2905]: E0114 01:42:36.566145 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:42:36.566944 kubelet[2905]: E0114 01:42:36.566392 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fjtvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-675898b8d4-dphrr_calico-apiserver(adf9db04-ef07-4e4b-ac7b-0a044973dca8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:36.567061 containerd[1676]: time="2026-01-14T01:42:36.566461228Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:42:36.568128 kubelet[2905]: E0114 01:42:36.567941 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:42:36.902376 containerd[1676]: time="2026-01-14T01:42:36.901620669Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 
01:42:36.903233 containerd[1676]: time="2026-01-14T01:42:36.903188233Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:42:36.903395 containerd[1676]: time="2026-01-14T01:42:36.903226513Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:36.903455 kubelet[2905]: E0114 01:42:36.903401 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:42:36.903535 kubelet[2905]: E0114 01:42:36.903461 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:42:36.904018 kubelet[2905]: E0114 01:42:36.903669 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9blv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-675898b8d4-fz6xg_calico-apiserver(ac329ad1-edb3-4891-9a01-4d5e568d082e): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:36.904181 containerd[1676]: time="2026-01-14T01:42:36.903757314Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:42:36.905115 kubelet[2905]: E0114 01:42:36.905084 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:42:37.247358 containerd[1676]: time="2026-01-14T01:42:37.246930735Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:42:37.248354 containerd[1676]: time="2026-01-14T01:42:37.248317658Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:42:37.248585 containerd[1676]: time="2026-01-14T01:42:37.248384619Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:37.248628 kubelet[2905]: E0114 01:42:37.248507 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:42:37.248628 kubelet[2905]: E0114 01:42:37.248551 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:42:37.248960 kubelet[2905]: E0114 01:42:37.248913 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bz2bm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fil
e,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-twlzn_calico-system(1b8220b4-811d-4471-95d7-cea88df93438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:37.252063 containerd[1676]: time="2026-01-14T01:42:37.252035828Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:42:37.571171 containerd[1676]: time="2026-01-14T01:42:37.571118948Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:42:37.572731 containerd[1676]: time="2026-01-14T01:42:37.572654712Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:42:37.573273 containerd[1676]: time="2026-01-14T01:42:37.572708152Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:37.573497 kubelet[2905]: E0114 01:42:37.573449 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:42:37.574065 kubelet[2905]: E0114 01:42:37.573783 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:42:37.574065 kubelet[2905]: E0114 01:42:37.573923 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bz2bm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Vol
umeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-twlzn_calico-system(1b8220b4-811d-4471-95d7-cea88df93438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:37.575237 kubelet[2905]: E0114 01:42:37.575099 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:42:41.231454 containerd[1676]: time="2026-01-14T01:42:41.231412089Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:42:41.571140 containerd[1676]: time="2026-01-14T01:42:41.571079741Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:42:41.573015 containerd[1676]: time="2026-01-14T01:42:41.572970906Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:42:41.573117 containerd[1676]: time="2026-01-14T01:42:41.573055626Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:41.573419 kubelet[2905]: E0114 01:42:41.573385 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:42:41.573704 kubelet[2905]: E0114 01:42:41.573432 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:42:41.573704 kubelet[2905]: E0114 01:42:41.573627 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:
,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tk8sx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-jg7bs_calico-system(e6d84f7a-c9f6-41d8-94ff-304c6e803e1e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:41.574062 containerd[1676]: time="2026-01-14T01:42:41.573943029Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 
14 01:42:41.574942 kubelet[2905]: E0114 01:42:41.574900 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:42:41.891174 containerd[1676]: time="2026-01-14T01:42:41.890640423Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:42:41.892512 containerd[1676]: time="2026-01-14T01:42:41.892446747Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:42:41.892594 containerd[1676]: time="2026-01-14T01:42:41.892538068Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:42:41.893272 kubelet[2905]: E0114 01:42:41.892752 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:42:41.893272 kubelet[2905]: E0114 01:42:41.892801 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:42:41.893272 kubelet[2905]: E0114 
01:42:41.892925 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pf62l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-b67dd7dbc-k5lcl_calico-system(8f23c871-1821-4e53-80e3-947513960a4b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:42:41.894454 kubelet[2905]: E0114 01:42:41.894419 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:42:42.232026 kubelet[2905]: E0114 01:42:42.231912 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11" Jan 14 01:42:51.232519 kubelet[2905]: E0114 01:42:51.232287 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:42:51.233541 kubelet[2905]: E0114 01:42:51.233489 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:42:52.237289 kubelet[2905]: E0114 01:42:52.237217 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:42:53.231320 kubelet[2905]: E0114 01:42:53.231237 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11" Jan 14 01:42:54.230533 kubelet[2905]: E0114 01:42:54.230485 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:42:55.231549 kubelet[2905]: E0114 01:42:55.231505 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:43:03.231031 kubelet[2905]: E0114 01:43:03.230752 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:43:04.229736 kubelet[2905]: E0114 01:43:04.229629 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:43:05.232398 kubelet[2905]: E0114 01:43:05.232293 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11" Jan 14 01:43:06.229665 kubelet[2905]: E0114 01:43:06.229602 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:43:07.233538 kubelet[2905]: E0114 01:43:07.233457 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed 
to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:43:08.230981 kubelet[2905]: E0114 01:43:08.230911 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:43:17.230503 containerd[1676]: time="2026-01-14T01:43:17.230461026Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:43:17.563709 containerd[1676]: time="2026-01-14T01:43:17.563453901Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:43:17.564937 containerd[1676]: time="2026-01-14T01:43:17.564797265Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:43:17.564937 containerd[1676]: time="2026-01-14T01:43:17.564868425Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active 
requests=0, bytes read=0" Jan 14 01:43:17.565061 kubelet[2905]: E0114 01:43:17.564983 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:43:17.565061 kubelet[2905]: E0114 01:43:17.565025 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:43:17.565350 kubelet[2905]: E0114 01:43:17.565155 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9blv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-675898b8d4-fz6xg_calico-apiserver(ac329ad1-edb3-4891-9a01-4d5e568d082e): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:43:17.566423 kubelet[2905]: E0114 01:43:17.566339 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:43:17.898000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.35.206:22-4.153.228.146:56484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:17.900850 kernel: kauditd_printk_skb: 220 callbacks suppressed Jan 14 01:43:17.900957 kernel: audit: type=1130 audit(1768354997.898:756): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.35.206:22-4.153.228.146:56484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:17.898995 systemd[1]: Started sshd@9-10.0.35.206:22-4.153.228.146:56484.service - OpenSSH per-connection server daemon (4.153.228.146:56484). 
Jan 14 01:43:18.231477 containerd[1676]: time="2026-01-14T01:43:18.230638816Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:43:18.428000 audit[5271]: USER_ACCT pid=5271 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:18.429579 sshd[5271]: Accepted publickey for core from 4.153.228.146 port 56484 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:43:18.433807 sshd-session[5271]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:43:18.428000 audit[5271]: CRED_ACQ pid=5271 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:18.439602 kernel: audit: type=1101 audit(1768354998.428:757): pid=5271 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:18.439696 kernel: audit: type=1103 audit(1768354998.428:758): pid=5271 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:18.442018 kernel: audit: type=1006 audit(1768354998.428:759): pid=5271 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 14 01:43:18.428000 audit[5271]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe5c40010 
a2=3 a3=0 items=0 ppid=1 pid=5271 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:18.446741 kernel: audit: type=1300 audit(1768354998.428:759): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe5c40010 a2=3 a3=0 items=0 ppid=1 pid=5271 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:18.428000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:43:18.448938 kernel: audit: type=1327 audit(1768354998.428:759): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:43:18.451553 systemd-logind[1651]: New session 11 of user core. Jan 14 01:43:18.457947 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 14 01:43:18.462000 audit[5271]: USER_START pid=5271 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:18.464000 audit[5275]: CRED_ACQ pid=5275 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:18.471211 kernel: audit: type=1105 audit(1768354998.462:760): pid=5271 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 
addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:18.471289 kernel: audit: type=1103 audit(1768354998.464:761): pid=5275 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:18.562516 containerd[1676]: time="2026-01-14T01:43:18.562325008Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:43:18.564187 containerd[1676]: time="2026-01-14T01:43:18.564129372Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:43:18.564288 containerd[1676]: time="2026-01-14T01:43:18.564166813Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:43:18.564370 kubelet[2905]: E0114 01:43:18.564333 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:43:18.564414 kubelet[2905]: E0114 01:43:18.564379 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:43:18.565486 kubelet[2905]: E0114 01:43:18.565412 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bz2bm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-twlzn_calico-system(1b8220b4-811d-4471-95d7-cea88df93438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 14 01:43:18.567506 containerd[1676]: time="2026-01-14T01:43:18.567455341Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:43:18.789095 sshd[5275]: Connection closed by 4.153.228.146 port 56484 Jan 14 01:43:18.789787 sshd-session[5271]: pam_unix(sshd:session): session closed for user core Jan 14 01:43:18.790000 audit[5271]: USER_END pid=5271 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:18.793714 systemd[1]: sshd@9-10.0.35.206:22-4.153.228.146:56484.service: Deactivated successfully. Jan 14 01:43:18.790000 audit[5271]: CRED_DISP pid=5271 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:18.795451 systemd[1]: session-11.scope: Deactivated successfully. Jan 14 01:43:18.797249 systemd-logind[1651]: Session 11 logged out. Waiting for processes to exit. 
Jan 14 01:43:18.798180 kernel: audit: type=1106 audit(1768354998.790:762): pid=5271 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:18.798311 kernel: audit: type=1104 audit(1768354998.790:763): pid=5271 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:18.793000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.35.206:22-4.153.228.146:56484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:18.798105 systemd-logind[1651]: Removed session 11. 
Jan 14 01:43:18.895138 containerd[1676]: time="2026-01-14T01:43:18.894986962Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:43:18.900094 containerd[1676]: time="2026-01-14T01:43:18.899952615Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:43:18.900094 containerd[1676]: time="2026-01-14T01:43:18.899996295Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:43:18.900301 kubelet[2905]: E0114 01:43:18.900257 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:43:18.900415 kubelet[2905]: E0114 01:43:18.900312 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:43:18.900470 kubelet[2905]: E0114 01:43:18.900433 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bz2bm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-twlzn_calico-system(1b8220b4-811d-4471-95d7-cea88df93438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:43:18.901659 kubelet[2905]: E0114 01:43:18.901593 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:43:19.230309 containerd[1676]: time="2026-01-14T01:43:19.230201243Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:43:19.564194 containerd[1676]: time="2026-01-14T01:43:19.564123761Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:43:19.565737 containerd[1676]: time="2026-01-14T01:43:19.565674165Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:43:19.565828 containerd[1676]: time="2026-01-14T01:43:19.565739645Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:43:19.565915 kubelet[2905]: E0114 01:43:19.565876 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:43:19.566184 kubelet[2905]: E0114 01:43:19.565923 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:43:19.566184 kubelet[2905]: E0114 01:43:19.566037 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fjtvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-675898b8d4-dphrr_calico-apiserver(adf9db04-ef07-4e4b-ac7b-0a044973dca8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:43:19.567522 kubelet[2905]: E0114 01:43:19.567483 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:43:20.229938 containerd[1676]: time="2026-01-14T01:43:20.229853191Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:43:20.230272 kubelet[2905]: E0114 01:43:20.230238 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:43:20.559035 containerd[1676]: time="2026-01-14T01:43:20.558966376Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:43:20.562167 containerd[1676]: time="2026-01-14T01:43:20.562110224Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:43:20.562262 containerd[1676]: time="2026-01-14T01:43:20.562185424Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:43:20.562374 kubelet[2905]: E0114 01:43:20.562334 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:43:20.562440 kubelet[2905]: E0114 01:43:20.562380 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:43:20.562662 kubelet[2905]: E0114 01:43:20.562494 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1bdf15913aa7461abab7765f5b689915,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gtqjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-77b7df9c9-vcm8x_calico-system(afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:43:20.564603 containerd[1676]: time="2026-01-14T01:43:20.564562950Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:43:20.891478 containerd[1676]: 
time="2026-01-14T01:43:20.891349610Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:43:20.893093 containerd[1676]: time="2026-01-14T01:43:20.893021294Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:43:20.893093 containerd[1676]: time="2026-01-14T01:43:20.893061334Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:43:20.893568 kubelet[2905]: E0114 01:43:20.893310 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:43:20.893568 kubelet[2905]: E0114 01:43:20.893403 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:43:20.893568 kubelet[2905]: E0114 01:43:20.893523 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gtqjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-77b7df9c9-vcm8x_calico-system(afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:43:20.895184 kubelet[2905]: E0114 01:43:20.895132 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11" Jan 14 01:43:21.230626 kubelet[2905]: E0114 01:43:21.230485 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:43:23.899274 systemd[1]: Started sshd@10-10.0.35.206:22-4.153.228.146:56486.service - OpenSSH per-connection server daemon (4.153.228.146:56486). Jan 14 01:43:23.898000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.35.206:22-4.153.228.146:56486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:43:23.900121 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 01:43:23.900183 kernel: audit: type=1130 audit(1768355003.898:765): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.35.206:22-4.153.228.146:56486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:24.448000 audit[5302]: USER_ACCT pid=5302 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:24.453746 kernel: audit: type=1101 audit(1768355004.448:766): pid=5302 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:24.453849 sshd[5302]: Accepted publickey for core from 4.153.228.146 port 56486 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:43:24.455758 sshd-session[5302]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:43:24.454000 audit[5302]: CRED_ACQ pid=5302 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:24.461660 kernel: audit: type=1103 audit(1768355004.454:767): pid=5302 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:24.461858 kernel: audit: type=1006 
audit(1768355004.454:768): pid=5302 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Jan 14 01:43:24.461929 kernel: audit: type=1300 audit(1768355004.454:768): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffeb96b4a0 a2=3 a3=0 items=0 ppid=1 pid=5302 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:24.454000 audit[5302]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffeb96b4a0 a2=3 a3=0 items=0 ppid=1 pid=5302 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:24.454000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:43:24.466920 kernel: audit: type=1327 audit(1768355004.454:768): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:43:24.469504 systemd-logind[1651]: New session 12 of user core. Jan 14 01:43:24.480081 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jan 14 01:43:24.483000 audit[5302]: USER_START pid=5302 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:24.485000 audit[5307]: CRED_ACQ pid=5307 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:24.492636 kernel: audit: type=1105 audit(1768355004.483:769): pid=5302 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:24.492714 kernel: audit: type=1103 audit(1768355004.485:770): pid=5307 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:24.847323 sshd[5307]: Connection closed by 4.153.228.146 port 56486 Jan 14 01:43:24.848315 sshd-session[5302]: pam_unix(sshd:session): session closed for user core Jan 14 01:43:24.850000 audit[5302]: USER_END pid=5302 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:24.854099 systemd-logind[1651]: Session 12 logged out. 
Waiting for processes to exit. Jan 14 01:43:24.854823 systemd[1]: sshd@10-10.0.35.206:22-4.153.228.146:56486.service: Deactivated successfully. Jan 14 01:43:24.850000 audit[5302]: CRED_DISP pid=5302 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:24.856645 systemd[1]: session-12.scope: Deactivated successfully. Jan 14 01:43:24.858113 systemd-logind[1651]: Removed session 12. Jan 14 01:43:24.858727 kernel: audit: type=1106 audit(1768355004.850:771): pid=5302 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:24.858800 kernel: audit: type=1104 audit(1768355004.850:772): pid=5302 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:24.854000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.35.206:22-4.153.228.146:56486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:29.961220 systemd[1]: Started sshd@11-10.0.35.206:22-4.153.228.146:40056.service - OpenSSH per-connection server daemon (4.153.228.146:40056). Jan 14 01:43:29.960000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.35.206:22-4.153.228.146:40056 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:43:29.962104 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 01:43:29.962176 kernel: audit: type=1130 audit(1768355009.960:774): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.35.206:22-4.153.228.146:40056 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:30.231109 kubelet[2905]: E0114 01:43:30.230982 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:43:30.231698 kubelet[2905]: E0114 01:43:30.231657 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:43:30.509000 audit[5360]: USER_ACCT pid=5360 uid=0 auid=4294967295 
ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:30.510888 sshd[5360]: Accepted publickey for core from 4.153.228.146 port 40056 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:43:30.514765 kernel: audit: type=1101 audit(1768355010.509:775): pid=5360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:30.514848 kernel: audit: type=1103 audit(1768355010.514:776): pid=5360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:30.514000 audit[5360]: CRED_ACQ pid=5360 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:30.515788 sshd-session[5360]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:43:30.519892 kernel: audit: type=1006 audit(1768355010.514:777): pid=5360 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Jan 14 01:43:30.519958 kernel: audit: type=1300 audit(1768355010.514:777): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe87ac000 a2=3 a3=0 items=0 ppid=1 pid=5360 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:30.514000 audit[5360]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe87ac000 a2=3 a3=0 items=0 ppid=1 pid=5360 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:30.523067 systemd-logind[1651]: New session 13 of user core. Jan 14 01:43:30.523608 kernel: audit: type=1327 audit(1768355010.514:777): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:43:30.514000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:43:30.532923 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 14 01:43:30.535000 audit[5360]: USER_START pid=5360 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:30.537000 audit[5364]: CRED_ACQ pid=5364 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:30.543227 kernel: audit: type=1105 audit(1768355010.535:778): pid=5360 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:30.543292 kernel: audit: type=1103 audit(1768355010.537:779): pid=5364 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:30.879863 sshd[5364]: Connection closed by 4.153.228.146 port 40056 Jan 14 01:43:30.879216 sshd-session[5360]: pam_unix(sshd:session): session closed for user core Jan 14 01:43:30.879000 audit[5360]: USER_END pid=5360 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:30.883250 systemd-logind[1651]: Session 13 logged out. Waiting for processes to exit. Jan 14 01:43:30.883394 systemd[1]: sshd@11-10.0.35.206:22-4.153.228.146:40056.service: Deactivated successfully. Jan 14 01:43:30.879000 audit[5360]: CRED_DISP pid=5360 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:30.885998 systemd[1]: session-13.scope: Deactivated successfully. 
Jan 14 01:43:30.887775 kernel: audit: type=1106 audit(1768355010.879:780): pid=5360 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:30.888113 kernel: audit: type=1104 audit(1768355010.879:781): pid=5360 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:30.880000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.35.206:22-4.153.228.146:40056 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:30.888461 systemd-logind[1651]: Removed session 13. Jan 14 01:43:30.992639 systemd[1]: Started sshd@12-10.0.35.206:22-4.153.228.146:40072.service - OpenSSH per-connection server daemon (4.153.228.146:40072). Jan 14 01:43:30.992000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.35.206:22-4.153.228.146:40072 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:43:31.236085 kubelet[2905]: E0114 01:43:31.235969 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11" Jan 14 01:43:31.548000 audit[5378]: USER_ACCT pid=5378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:31.549034 sshd[5378]: Accepted publickey for core from 4.153.228.146 port 40072 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:43:31.549000 audit[5378]: CRED_ACQ pid=5378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:31.549000 audit[5378]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdcd63480 a2=3 a3=0 items=0 ppid=1 pid=5378 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:31.549000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:43:31.550588 sshd-session[5378]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:43:31.554760 systemd-logind[1651]: New session 14 of user core. Jan 14 01:43:31.564018 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 14 01:43:31.567000 audit[5378]: USER_START pid=5378 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:31.569000 audit[5382]: CRED_ACQ pid=5382 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:31.945738 sshd[5382]: Connection closed by 4.153.228.146 port 40072 Jan 14 01:43:31.945966 sshd-session[5378]: pam_unix(sshd:session): session closed for user core Jan 14 01:43:31.946000 audit[5378]: USER_END pid=5378 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:31.946000 audit[5378]: CRED_DISP pid=5378 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:31.950089 systemd[1]: 
sshd@12-10.0.35.206:22-4.153.228.146:40072.service: Deactivated successfully. Jan 14 01:43:31.949000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.35.206:22-4.153.228.146:40072 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:31.952004 systemd[1]: session-14.scope: Deactivated successfully. Jan 14 01:43:31.952722 systemd-logind[1651]: Session 14 logged out. Waiting for processes to exit. Jan 14 01:43:31.953706 systemd-logind[1651]: Removed session 14. Jan 14 01:43:32.057000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.35.206:22-4.153.228.146:40074 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:32.057952 systemd[1]: Started sshd@13-10.0.35.206:22-4.153.228.146:40074.service - OpenSSH per-connection server daemon (4.153.228.146:40074). 
Jan 14 01:43:32.231581 containerd[1676]: time="2026-01-14T01:43:32.231319974Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 01:43:32.579204 containerd[1676]: time="2026-01-14T01:43:32.578855166Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:43:32.580642 containerd[1676]: time="2026-01-14T01:43:32.580533690Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 01:43:32.580788 containerd[1676]: time="2026-01-14T01:43:32.580654770Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 01:43:32.580878 kubelet[2905]: E0114 01:43:32.580791 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:43:32.580878 kubelet[2905]: E0114 01:43:32.580829 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 01:43:32.581730 kubelet[2905]: E0114 01:43:32.580955 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pf62l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-b67dd7dbc-k5lcl_calico-system(8f23c871-1821-4e53-80e3-947513960a4b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 01:43:32.582189 kubelet[2905]: E0114 01:43:32.582139 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:43:32.607000 audit[5394]: USER_ACCT pid=5394 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh 
res=success' Jan 14 01:43:32.608523 sshd[5394]: Accepted publickey for core from 4.153.228.146 port 40074 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:43:32.609000 audit[5394]: CRED_ACQ pid=5394 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:32.609000 audit[5394]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff31ac6d0 a2=3 a3=0 items=0 ppid=1 pid=5394 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:32.609000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:43:32.611477 sshd-session[5394]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:43:32.617575 systemd-logind[1651]: New session 15 of user core. Jan 14 01:43:32.629958 systemd[1]: Started session-15.scope - Session 15 of User core. 
Jan 14 01:43:32.632000 audit[5394]: USER_START pid=5394 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:32.634000 audit[5398]: CRED_ACQ pid=5398 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:32.982113 sshd[5398]: Connection closed by 4.153.228.146 port 40074 Jan 14 01:43:32.982677 sshd-session[5394]: pam_unix(sshd:session): session closed for user core Jan 14 01:43:32.983000 audit[5394]: USER_END pid=5394 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:32.983000 audit[5394]: CRED_DISP pid=5394 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:32.987124 systemd[1]: sshd@13-10.0.35.206:22-4.153.228.146:40074.service: Deactivated successfully. Jan 14 01:43:32.986000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.35.206:22-4.153.228.146:40074 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:32.989025 systemd[1]: session-15.scope: Deactivated successfully. 
Jan 14 01:43:32.991490 systemd-logind[1651]: Session 15 logged out. Waiting for processes to exit. Jan 14 01:43:32.992486 systemd-logind[1651]: Removed session 15. Jan 14 01:43:34.230508 containerd[1676]: time="2026-01-14T01:43:34.230431109Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 01:43:34.552353 containerd[1676]: time="2026-01-14T01:43:34.552247636Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:43:34.553565 containerd[1676]: time="2026-01-14T01:43:34.553518079Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 01:43:34.553655 containerd[1676]: time="2026-01-14T01:43:34.553599159Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 01:43:34.553823 kubelet[2905]: E0114 01:43:34.553786 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:43:34.554101 kubelet[2905]: E0114 01:43:34.553837 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 01:43:34.554101 kubelet[2905]: E0114 01:43:34.553968 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tk8sx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-jg7bs_calico-system(e6d84f7a-c9f6-41d8-94ff-304c6e803e1e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 01:43:34.555348 kubelet[2905]: E0114 01:43:34.555292 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:43:35.229892 kubelet[2905]: E0114 01:43:35.229715 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:43:38.096461 systemd[1]: Started sshd@14-10.0.35.206:22-4.153.228.146:41764.service - OpenSSH per-connection server daemon (4.153.228.146:41764). Jan 14 01:43:38.095000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.35.206:22-4.153.228.146:41764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:38.100429 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 14 01:43:38.100516 kernel: audit: type=1130 audit(1768355018.095:801): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.35.206:22-4.153.228.146:41764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:43:38.631000 audit[5411]: USER_ACCT pid=5411 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:38.635503 sshd[5411]: Accepted publickey for core from 4.153.228.146 port 41764 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:43:38.637124 kernel: audit: type=1101 audit(1768355018.631:802): pid=5411 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:38.637156 kernel: audit: type=1103 audit(1768355018.635:803): pid=5411 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:38.635000 audit[5411]: CRED_ACQ pid=5411 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:38.636877 sshd-session[5411]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:43:38.641261 kernel: audit: type=1006 audit(1768355018.635:804): pid=5411 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 14 01:43:38.635000 audit[5411]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe022d620 a2=3 a3=0 items=0 ppid=1 pid=5411 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:38.644672 systemd-logind[1651]: New session 16 of user core. Jan 14 01:43:38.645008 kernel: audit: type=1300 audit(1768355018.635:804): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe022d620 a2=3 a3=0 items=0 ppid=1 pid=5411 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:38.635000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:43:38.646414 kernel: audit: type=1327 audit(1768355018.635:804): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:43:38.653104 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 14 01:43:38.656000 audit[5411]: USER_START pid=5411 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:38.657000 audit[5415]: CRED_ACQ pid=5415 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:38.663708 kernel: audit: type=1105 audit(1768355018.656:805): pid=5411 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:38.663861 kernel: audit: type=1103 audit(1768355018.657:806): 
pid=5415 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:39.008851 sshd[5415]: Connection closed by 4.153.228.146 port 41764 Jan 14 01:43:39.009538 sshd-session[5411]: pam_unix(sshd:session): session closed for user core Jan 14 01:43:39.010000 audit[5411]: USER_END pid=5411 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:39.013496 systemd[1]: sshd@14-10.0.35.206:22-4.153.228.146:41764.service: Deactivated successfully. Jan 14 01:43:39.010000 audit[5411]: CRED_DISP pid=5411 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:39.015895 systemd[1]: session-16.scope: Deactivated successfully. 
Jan 14 01:43:39.018410 kernel: audit: type=1106 audit(1768355019.010:807): pid=5411 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:39.018487 kernel: audit: type=1104 audit(1768355019.010:808): pid=5411 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:39.011000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.35.206:22-4.153.228.146:41764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:39.017989 systemd-logind[1651]: Session 16 logged out. Waiting for processes to exit. Jan 14 01:43:39.019162 systemd-logind[1651]: Removed session 16. Jan 14 01:43:39.118561 systemd[1]: Started sshd@15-10.0.35.206:22-4.153.228.146:41774.service - OpenSSH per-connection server daemon (4.153.228.146:41774). Jan 14 01:43:39.117000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.35.206:22-4.153.228.146:41774 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:43:39.673000 audit[5428]: USER_ACCT pid=5428 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:39.674832 sshd[5428]: Accepted publickey for core from 4.153.228.146 port 41774 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:43:39.674000 audit[5428]: CRED_ACQ pid=5428 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:39.674000 audit[5428]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff36e2cb0 a2=3 a3=0 items=0 ppid=1 pid=5428 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:39.674000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:43:39.675834 sshd-session[5428]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:43:39.682543 systemd-logind[1651]: New session 17 of user core. Jan 14 01:43:39.686914 systemd[1]: Started session-17.scope - Session 17 of User core. 
Jan 14 01:43:39.689000 audit[5428]: USER_START pid=5428 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:39.691000 audit[5432]: CRED_ACQ pid=5432 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:40.108770 sshd[5432]: Connection closed by 4.153.228.146 port 41774 Jan 14 01:43:40.108914 sshd-session[5428]: pam_unix(sshd:session): session closed for user core Jan 14 01:43:40.111000 audit[5428]: USER_END pid=5428 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:40.111000 audit[5428]: CRED_DISP pid=5428 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:40.115015 systemd[1]: sshd@15-10.0.35.206:22-4.153.228.146:41774.service: Deactivated successfully. Jan 14 01:43:40.116000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.35.206:22-4.153.228.146:41774 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:40.118820 systemd[1]: session-17.scope: Deactivated successfully. 
Jan 14 01:43:40.120632 systemd-logind[1651]: Session 17 logged out. Waiting for processes to exit. Jan 14 01:43:40.121289 systemd-logind[1651]: Removed session 17. Jan 14 01:43:40.219586 systemd[1]: Started sshd@16-10.0.35.206:22-4.153.228.146:41784.service - OpenSSH per-connection server daemon (4.153.228.146:41784). Jan 14 01:43:40.219000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.35.206:22-4.153.228.146:41784 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:40.757000 audit[5443]: USER_ACCT pid=5443 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:40.758836 sshd[5443]: Accepted publickey for core from 4.153.228.146 port 41784 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:43:40.760000 audit[5443]: CRED_ACQ pid=5443 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:40.760000 audit[5443]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffea4e4500 a2=3 a3=0 items=0 ppid=1 pid=5443 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:40.760000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:43:40.761872 sshd-session[5443]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:43:40.771847 systemd-logind[1651]: New session 18 of user core. 
Jan 14 01:43:40.776464 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 14 01:43:40.780000 audit[5443]: USER_START pid=5443 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:40.782000 audit[5447]: CRED_ACQ pid=5447 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:41.526000 audit[5459]: NETFILTER_CFG table=filter:146 family=2 entries=26 op=nft_register_rule pid=5459 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:43:41.526000 audit[5459]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffd3334a50 a2=0 a3=1 items=0 ppid=3072 pid=5459 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:41.526000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:43:41.533000 audit[5459]: NETFILTER_CFG table=nat:147 family=2 entries=20 op=nft_register_rule pid=5459 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:43:41.533000 audit[5459]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffd3334a50 a2=0 a3=1 items=0 ppid=3072 pid=5459 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:41.533000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:43:41.550000 audit[5461]: NETFILTER_CFG table=filter:148 family=2 entries=38 op=nft_register_rule pid=5461 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:43:41.550000 audit[5461]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffda555fb0 a2=0 a3=1 items=0 ppid=3072 pid=5461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:41.550000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:43:41.554000 audit[5461]: NETFILTER_CFG table=nat:149 family=2 entries=20 op=nft_register_rule pid=5461 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 01:43:41.554000 audit[5461]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffda555fb0 a2=0 a3=1 items=0 ppid=3072 pid=5461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:41.554000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 01:43:41.635083 sshd[5447]: Connection closed by 4.153.228.146 port 41784 Jan 14 01:43:41.636877 sshd-session[5443]: pam_unix(sshd:session): session closed for user core Jan 14 01:43:41.637000 audit[5443]: USER_END pid=5443 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 
addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:41.637000 audit[5443]: CRED_DISP pid=5443 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:41.641106 systemd[1]: sshd@16-10.0.35.206:22-4.153.228.146:41784.service: Deactivated successfully. Jan 14 01:43:41.640000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.35.206:22-4.153.228.146:41784 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:41.644237 systemd[1]: session-18.scope: Deactivated successfully. Jan 14 01:43:41.646284 systemd-logind[1651]: Session 18 logged out. Waiting for processes to exit. Jan 14 01:43:41.647584 systemd-logind[1651]: Removed session 18. Jan 14 01:43:41.746676 systemd[1]: Started sshd@17-10.0.35.206:22-4.153.228.146:41800.service - OpenSSH per-connection server daemon (4.153.228.146:41800). Jan 14 01:43:41.746000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.35.206:22-4.153.228.146:41800 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:43:42.285645 sshd[5466]: Accepted publickey for core from 4.153.228.146 port 41800 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:43:42.284000 audit[5466]: USER_ACCT pid=5466 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:42.285000 audit[5466]: CRED_ACQ pid=5466 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:42.285000 audit[5466]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff92b6950 a2=3 a3=0 items=0 ppid=1 pid=5466 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:43:42.285000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:43:42.287425 sshd-session[5466]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:43:42.292888 systemd-logind[1651]: New session 19 of user core. Jan 14 01:43:42.304126 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 14 01:43:42.307000 audit[5466]: USER_START pid=5466 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:42.308000 audit[5470]: CRED_ACQ pid=5470 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:42.743980 sshd[5470]: Connection closed by 4.153.228.146 port 41800 Jan 14 01:43:42.744427 sshd-session[5466]: pam_unix(sshd:session): session closed for user core Jan 14 01:43:42.745000 audit[5466]: USER_END pid=5466 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:42.745000 audit[5466]: CRED_DISP pid=5466 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:42.749121 systemd[1]: sshd@17-10.0.35.206:22-4.153.228.146:41800.service: Deactivated successfully. Jan 14 01:43:42.750000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.35.206:22-4.153.228.146:41800 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:42.752861 systemd[1]: session-19.scope: Deactivated successfully. 
Jan 14 01:43:42.753952 systemd-logind[1651]: Session 19 logged out. Waiting for processes to exit. Jan 14 01:43:42.755456 systemd-logind[1651]: Removed session 19. Jan 14 01:43:42.854242 systemd[1]: Started sshd@18-10.0.35.206:22-4.153.228.146:41808.service - OpenSSH per-connection server daemon (4.153.228.146:41808). Jan 14 01:43:42.853000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.35.206:22-4.153.228.146:41808 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:43:43.231300 kubelet[2905]: E0114 01:43:43.231252 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:43:43.232278 kubelet[2905]: E0114 01:43:43.232214 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:43:43.232536 kubelet[2905]: E0114 01:43:43.232505 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:43:43.389000 audit[5481]: USER_ACCT pid=5481 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:43.390687 sshd[5481]: Accepted publickey for core from 4.153.228.146 port 41808 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:43:43.391243 kernel: kauditd_printk_skb: 47 callbacks suppressed Jan 14 01:43:43.391281 kernel: audit: type=1101 audit(1768355023.389:842): pid=5481 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:43.397769 kernel: audit: type=1103 audit(1768355023.394:843): pid=5481 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:43:43.394000 audit[5481]: 
CRED_ACQ pid=5481 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:43.395856 sshd-session[5481]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:43:43.400119 kernel: audit: type=1006 audit(1768355023.394:844): pid=5481 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1
Jan 14 01:43:43.394000 audit[5481]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc0c201a0 a2=3 a3=0 items=0 ppid=1 pid=5481 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:43:43.402466 systemd-logind[1651]: New session 20 of user core.
Jan 14 01:43:43.405796 kernel: audit: type=1300 audit(1768355023.394:844): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc0c201a0 a2=3 a3=0 items=0 ppid=1 pid=5481 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:43:43.405859 kernel: audit: type=1327 audit(1768355023.394:844): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:43:43.394000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:43:43.409907 systemd[1]: Started session-20.scope - Session 20 of User core.
Jan 14 01:43:43.412000 audit[5481]: USER_START pid=5481 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:43.416000 audit[5487]: CRED_ACQ pid=5487 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:43.417787 kernel: audit: type=1105 audit(1768355023.412:845): pid=5481 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:43.421755 kernel: audit: type=1103 audit(1768355023.416:846): pid=5487 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:43.746877 sshd[5487]: Connection closed by 4.153.228.146 port 41808
Jan 14 01:43:43.747162 sshd-session[5481]: pam_unix(sshd:session): session closed for user core
Jan 14 01:43:43.749000 audit[5481]: USER_END pid=5481 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:43.752990 systemd-logind[1651]: Session 20 logged out. Waiting for processes to exit.
Jan 14 01:43:43.753133 systemd[1]: sshd@18-10.0.35.206:22-4.153.228.146:41808.service: Deactivated successfully.
Jan 14 01:43:43.749000 audit[5481]: CRED_DISP pid=5481 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:43.754749 kernel: audit: type=1106 audit(1768355023.749:847): pid=5481 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:43.755776 systemd[1]: session-20.scope: Deactivated successfully.
Jan 14 01:43:43.753000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.35.206:22-4.153.228.146:41808 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:43:43.757901 systemd-logind[1651]: Removed session 20.
Jan 14 01:43:43.760653 kernel: audit: type=1104 audit(1768355023.749:848): pid=5481 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:43.760757 kernel: audit: type=1131 audit(1768355023.753:849): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.35.206:22-4.153.228.146:41808 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:43:44.230869 kubelet[2905]: E0114 01:43:44.230818 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11"
Jan 14 01:43:46.088000 audit[5501]: NETFILTER_CFG table=filter:150 family=2 entries=26 op=nft_register_rule pid=5501 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Jan 14 01:43:46.088000 audit[5501]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffdd3f56d0 a2=0 a3=1 items=0 ppid=3072 pid=5501 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:43:46.088000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Jan 14 01:43:46.098000 audit[5501]: NETFILTER_CFG table=nat:151 family=2 entries=104 op=nft_register_chain pid=5501 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Jan 14 01:43:46.098000 audit[5501]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=ffffdd3f56d0 a2=0 a3=1 items=0 ppid=3072 pid=5501 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:43:46.098000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Jan 14 01:43:46.230388 kubelet[2905]: E0114 01:43:46.230192 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8"
Jan 14 01:43:48.232467 kubelet[2905]: E0114 01:43:48.232256 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e"
Jan 14 01:43:48.861416 systemd[1]: Started sshd@19-10.0.35.206:22-4.153.228.146:37668.service - OpenSSH per-connection server daemon (4.153.228.146:37668).
Jan 14 01:43:48.860000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.35.206:22-4.153.228.146:37668 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:43:48.866023 kernel: kauditd_printk_skb: 6 callbacks suppressed
Jan 14 01:43:48.866129 kernel: audit: type=1130 audit(1768355028.860:852): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.35.206:22-4.153.228.146:37668 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:43:49.399000 audit[5503]: USER_ACCT pid=5503 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:49.400599 sshd[5503]: Accepted publickey for core from 4.153.228.146 port 37668 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc
Jan 14 01:43:49.403285 sshd-session[5503]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:43:49.401000 audit[5503]: CRED_ACQ pid=5503 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:49.408455 kernel: audit: type=1101 audit(1768355029.399:853): pid=5503 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:49.408591 kernel: audit: type=1103 audit(1768355029.401:854): pid=5503 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:49.411168 kernel: audit: type=1006 audit(1768355029.401:855): pid=5503 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1
Jan 14 01:43:49.411307 kernel: audit: type=1300 audit(1768355029.401:855): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd692a8d0 a2=3 a3=0 items=0 ppid=1 pid=5503 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:43:49.401000 audit[5503]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd692a8d0 a2=3 a3=0 items=0 ppid=1 pid=5503 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:43:49.401000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:43:49.417974 kernel: audit: type=1327 audit(1768355029.401:855): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:43:49.418613 systemd-logind[1651]: New session 21 of user core.
Jan 14 01:43:49.431900 systemd[1]: Started session-21.scope - Session 21 of User core.
Jan 14 01:43:49.433000 audit[5503]: USER_START pid=5503 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:49.439748 kernel: audit: type=1105 audit(1768355029.433:856): pid=5503 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:49.439835 kernel: audit: type=1103 audit(1768355029.438:857): pid=5507 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:49.438000 audit[5507]: CRED_ACQ pid=5507 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:49.774576 sshd[5507]: Connection closed by 4.153.228.146 port 37668
Jan 14 01:43:49.775429 sshd-session[5503]: pam_unix(sshd:session): session closed for user core
Jan 14 01:43:49.776000 audit[5503]: USER_END pid=5503 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:49.782941 systemd[1]: sshd@19-10.0.35.206:22-4.153.228.146:37668.service: Deactivated successfully.
Jan 14 01:43:49.776000 audit[5503]: CRED_DISP pid=5503 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:49.787203 kernel: audit: type=1106 audit(1768355029.776:858): pid=5503 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:49.787365 kernel: audit: type=1104 audit(1768355029.776:859): pid=5503 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:49.788196 systemd[1]: session-21.scope: Deactivated successfully.
Jan 14 01:43:49.782000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.35.206:22-4.153.228.146:37668 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:43:49.788953 systemd-logind[1651]: Session 21 logged out. Waiting for processes to exit.
Jan 14 01:43:49.790474 systemd-logind[1651]: Removed session 21.
Jan 14 01:43:54.888013 systemd[1]: Started sshd@20-10.0.35.206:22-4.153.228.146:39802.service - OpenSSH per-connection server daemon (4.153.228.146:39802).
Jan 14 01:43:54.887000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.35.206:22-4.153.228.146:39802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:43:54.889185 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 14 01:43:54.889256 kernel: audit: type=1130 audit(1768355034.887:861): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.35.206:22-4.153.228.146:39802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:43:55.432000 audit[5520]: USER_ACCT pid=5520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:55.433449 sshd[5520]: Accepted publickey for core from 4.153.228.146 port 39802 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc
Jan 14 01:43:55.437000 audit[5520]: CRED_ACQ pid=5520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:55.438992 sshd-session[5520]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:43:55.441686 kernel: audit: type=1101 audit(1768355035.432:862): pid=5520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:55.441764 kernel: audit: type=1103 audit(1768355035.437:863): pid=5520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:55.443623 kernel: audit: type=1006 audit(1768355035.437:864): pid=5520 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1
Jan 14 01:43:55.443674 kernel: audit: type=1300 audit(1768355035.437:864): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd3f40210 a2=3 a3=0 items=0 ppid=1 pid=5520 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:43:55.437000 audit[5520]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd3f40210 a2=3 a3=0 items=0 ppid=1 pid=5520 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:43:55.446374 systemd-logind[1651]: New session 22 of user core.
Jan 14 01:43:55.437000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:43:55.448399 kernel: audit: type=1327 audit(1768355035.437:864): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:43:55.450903 systemd[1]: Started session-22.scope - Session 22 of User core.
Jan 14 01:43:55.454000 audit[5520]: USER_START pid=5520 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:55.455000 audit[5524]: CRED_ACQ pid=5524 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:55.461699 kernel: audit: type=1105 audit(1768355035.454:865): pid=5520 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:55.461839 kernel: audit: type=1103 audit(1768355035.455:866): pid=5524 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:55.806331 sshd[5524]: Connection closed by 4.153.228.146 port 39802
Jan 14 01:43:55.805687 sshd-session[5520]: pam_unix(sshd:session): session closed for user core
Jan 14 01:43:55.806000 audit[5520]: USER_END pid=5520 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:55.811048 systemd[1]: sshd@20-10.0.35.206:22-4.153.228.146:39802.service: Deactivated successfully.
Jan 14 01:43:55.806000 audit[5520]: CRED_DISP pid=5520 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:55.812957 systemd[1]: session-22.scope: Deactivated successfully.
Jan 14 01:43:55.813892 systemd-logind[1651]: Session 22 logged out. Waiting for processes to exit.
Jan 14 01:43:55.814393 kernel: audit: type=1106 audit(1768355035.806:867): pid=5520 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:55.814839 kernel: audit: type=1104 audit(1768355035.806:868): pid=5520 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:43:55.811000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.35.206:22-4.153.228.146:39802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:43:55.816490 systemd-logind[1651]: Removed session 22.
Jan 14 01:43:56.230503 kubelet[2905]: E0114 01:43:56.230388 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11"
Jan 14 01:43:57.230388 kubelet[2905]: E0114 01:43:57.230326 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b"
Jan 14 01:43:58.230924 kubelet[2905]: E0114 01:43:58.230865 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e"
Jan 14 01:43:58.231771 kubelet[2905]: E0114 01:43:58.231061 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438"
Jan 14 01:43:59.230673 kubelet[2905]: E0114 01:43:59.230447 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e"
Jan 14 01:44:00.925614 systemd[1]: Started sshd@21-10.0.35.206:22-4.153.228.146:39808.service - OpenSSH per-connection server daemon (4.153.228.146:39808).
Jan 14 01:44:00.924000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.35.206:22-4.153.228.146:39808 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:44:00.929542 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 14 01:44:00.929599 kernel: audit: type=1130 audit(1768355040.924:870): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.35.206:22-4.153.228.146:39808 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:44:01.230707 kubelet[2905]: E0114 01:44:01.230498 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8"
Jan 14 01:44:01.482000 audit[5564]: USER_ACCT pid=5564 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:01.483442 sshd[5564]: Accepted publickey for core from 4.153.228.146 port 39808 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc
Jan 14 01:44:01.487748 kernel: audit: type=1101 audit(1768355041.482:871): pid=5564 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:01.487000 audit[5564]: CRED_ACQ pid=5564 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:01.488638 sshd-session[5564]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:44:01.492897 kernel: audit: type=1103 audit(1768355041.487:872): pid=5564 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:01.492975 kernel: audit: type=1006 audit(1768355041.487:873): pid=5564 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1
Jan 14 01:44:01.492995 kernel: audit: type=1300 audit(1768355041.487:873): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffd5368c0 a2=3 a3=0 items=0 ppid=1 pid=5564 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:44:01.487000 audit[5564]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffd5368c0 a2=3 a3=0 items=0 ppid=1 pid=5564 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:44:01.494584 systemd-logind[1651]: New session 23 of user core.
Jan 14 01:44:01.496228 kernel: audit: type=1327 audit(1768355041.487:873): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:44:01.487000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:44:01.503978 systemd[1]: Started session-23.scope - Session 23 of User core.
Jan 14 01:44:01.507000 audit[5564]: USER_START pid=5564 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:01.509000 audit[5568]: CRED_ACQ pid=5568 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:01.515472 kernel: audit: type=1105 audit(1768355041.507:874): pid=5564 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:01.515639 kernel: audit: type=1103 audit(1768355041.509:875): pid=5568 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:01.868641 sshd[5568]: Connection closed by 4.153.228.146 port 39808
Jan 14 01:44:01.868938 sshd-session[5564]: pam_unix(sshd:session): session closed for user core
Jan 14 01:44:01.870000 audit[5564]: USER_END pid=5564 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:01.874128 systemd-logind[1651]: Session 23 logged out. Waiting for processes to exit.
Jan 14 01:44:01.874760 systemd[1]: sshd@21-10.0.35.206:22-4.153.228.146:39808.service: Deactivated successfully.
Jan 14 01:44:01.870000 audit[5564]: CRED_DISP pid=5564 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:01.877225 systemd[1]: session-23.scope: Deactivated successfully.
Jan 14 01:44:01.878167 kernel: audit: type=1106 audit(1768355041.870:876): pid=5564 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:01.878218 kernel: audit: type=1104 audit(1768355041.870:877): pid=5564 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:01.879372 systemd-logind[1651]: Removed session 23.
Jan 14 01:44:01.874000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.35.206:22-4.153.228.146:39808 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:44:06.984000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.35.206:22-4.153.228.146:44990 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:44:06.984690 systemd[1]: Started sshd@22-10.0.35.206:22-4.153.228.146:44990.service - OpenSSH per-connection server daemon (4.153.228.146:44990).
Jan 14 01:44:06.988779 kernel: kauditd_printk_skb: 1 callbacks suppressed
Jan 14 01:44:06.988835 kernel: audit: type=1130 audit(1768355046.984:879): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.35.206:22-4.153.228.146:44990 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 14 01:44:07.518000 audit[5583]: USER_ACCT pid=5583 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:07.521796 sshd[5583]: Accepted publickey for core from 4.153.228.146 port 44990 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc
Jan 14 01:44:07.522756 kernel: audit: type=1101 audit(1768355047.518:880): pid=5583 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:07.523787 sshd-session[5583]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 14 01:44:07.522000 audit[5583]: CRED_ACQ pid=5583 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:07.529077 kernel: audit: type=1103 audit(1768355047.522:881): pid=5583 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:07.529145 kernel: audit: type=1006 audit(1768355047.522:882): pid=5583 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1
Jan 14 01:44:07.522000 audit[5583]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff4a2adc0 a2=3 a3=0 items=0 ppid=1 pid=5583 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:44:07.532959 kernel: audit: type=1300 audit(1768355047.522:882): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff4a2adc0 a2=3 a3=0 items=0 ppid=1 pid=5583 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 14 01:44:07.533303 kernel: audit: type=1327 audit(1768355047.522:882): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:44:07.522000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Jan 14 01:44:07.537408 systemd-logind[1651]: New session 24 of user core.
Jan 14 01:44:07.552134 systemd[1]: Started session-24.scope - Session 24 of User core.
Jan 14 01:44:07.553000 audit[5583]: USER_START pid=5583 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:07.558000 audit[5587]: CRED_ACQ pid=5587 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:07.562334 kernel: audit: type=1105 audit(1768355047.553:883): pid=5583 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:07.562438 kernel: audit: type=1103 audit(1768355047.558:884): pid=5587 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:07.875411 sshd[5587]: Connection closed by 4.153.228.146 port 44990
Jan 14 01:44:07.874543 sshd-session[5583]: pam_unix(sshd:session): session closed for user core
Jan 14 01:44:07.875000 audit[5583]: USER_END pid=5583 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success'
Jan 14 01:44:07.879703 systemd[1]:
sshd@22-10.0.35.206:22-4.153.228.146:44990.service: Deactivated successfully. Jan 14 01:44:07.876000 audit[5583]: CRED_DISP pid=5583 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:07.881631 systemd[1]: session-24.scope: Deactivated successfully. Jan 14 01:44:07.882386 systemd-logind[1651]: Session 24 logged out. Waiting for processes to exit. Jan 14 01:44:07.883975 kernel: audit: type=1106 audit(1768355047.875:885): pid=5583 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:07.884034 kernel: audit: type=1104 audit(1768355047.876:886): pid=5583 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:07.879000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.35.206:22-4.153.228.146:44990 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:44:07.884316 systemd-logind[1651]: Removed session 24. 
Jan 14 01:44:09.234463 kubelet[2905]: E0114 01:44:09.234404 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11" Jan 14 01:44:10.229979 kubelet[2905]: E0114 01:44:10.229933 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:44:11.230514 kubelet[2905]: E0114 01:44:11.230457 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:44:12.230749 kubelet[2905]: E0114 01:44:12.230696 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:44:12.983000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.35.206:22-4.153.228.146:45006 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:44:12.984199 systemd[1]: Started sshd@23-10.0.35.206:22-4.153.228.146:45006.service - OpenSSH per-connection server daemon (4.153.228.146:45006). Jan 14 01:44:12.985060 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 01:44:12.985108 kernel: audit: type=1130 audit(1768355052.983:888): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.35.206:22-4.153.228.146:45006 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 01:44:13.233166 kubelet[2905]: E0114 01:44:13.231826 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:44:13.538278 sshd[5600]: Accepted publickey for core from 4.153.228.146 port 45006 ssh2: RSA SHA256:9ArD8oY4wx9560KO5HF5eeU9U2GLIlUqUj7TFPIBzRc Jan 14 01:44:13.537000 audit[5600]: USER_ACCT pid=5600 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:13.541000 audit[5600]: CRED_ACQ pid=5600 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:13.542899 sshd-session[5600]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 01:44:13.545821 kernel: audit: type=1101 audit(1768355053.537:889): pid=5600 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:13.545984 kernel: audit: type=1103 audit(1768355053.541:890): pid=5600 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:13.546051 kernel: audit: type=1006 audit(1768355053.541:891): pid=5600 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Jan 14 01:44:13.541000 audit[5600]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffeae2b2d0 a2=3 a3=0 items=0 ppid=1 pid=5600 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:13.551517 kernel: audit: type=1300 audit(1768355053.541:891): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffeae2b2d0 a2=3 a3=0 items=0 ppid=1 pid=5600 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:13.541000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:44:13.552807 kernel: audit: type=1327 audit(1768355053.541:891): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 01:44:13.554675 systemd-logind[1651]: New session 25 of user core. Jan 14 01:44:13.563092 systemd[1]: Started session-25.scope - Session 25 of User core. 
Jan 14 01:44:13.564000 audit[5600]: USER_START pid=5600 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:13.569764 kernel: audit: type=1105 audit(1768355053.564:892): pid=5600 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:13.569877 kernel: audit: type=1103 audit(1768355053.568:893): pid=5606 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:13.568000 audit[5606]: CRED_ACQ pid=5606 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:13.902231 sshd[5606]: Connection closed by 4.153.228.146 port 45006 Jan 14 01:44:13.901500 sshd-session[5600]: pam_unix(sshd:session): session closed for user core Jan 14 01:44:13.902000 audit[5600]: USER_END pid=5600 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:13.906039 systemd[1]: 
sshd@23-10.0.35.206:22-4.153.228.146:45006.service: Deactivated successfully. Jan 14 01:44:13.902000 audit[5600]: CRED_DISP pid=5600 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:13.909443 systemd[1]: session-25.scope: Deactivated successfully. Jan 14 01:44:13.910080 kernel: audit: type=1106 audit(1768355053.902:894): pid=5600 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:13.910204 kernel: audit: type=1104 audit(1768355053.902:895): pid=5600 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 01:44:13.905000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.35.206:22-4.153.228.146:45006 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 01:44:13.912272 systemd-logind[1651]: Session 25 logged out. Waiting for processes to exit. Jan 14 01:44:13.916895 systemd-logind[1651]: Removed session 25. 
Jan 14 01:44:15.232107 kubelet[2905]: E0114 01:44:15.231532 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:44:20.230244 kubelet[2905]: E0114 01:44:20.230158 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11" Jan 14 01:44:22.230024 kubelet[2905]: E0114 01:44:22.229865 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:44:23.230664 kubelet[2905]: E0114 01:44:23.230619 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:44:25.231093 kubelet[2905]: E0114 01:44:25.231047 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:44:26.230157 kubelet[2905]: E0114 01:44:26.229911 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:44:27.231345 kubelet[2905]: E0114 01:44:27.231275 2905 pod_workers.go:1301] "Error syncing pod, skipping" 
err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:44:33.230348 kubelet[2905]: E0114 01:44:33.230269 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11" Jan 14 01:44:35.230624 kubelet[2905]: E0114 01:44:35.230546 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:44:35.724394 systemd[1]: cri-containerd-b6ab4df23c040366dc69c84b2a8511db28ca7078235904ae9790ea5b1cb8081a.scope: Deactivated successfully. Jan 14 01:44:35.725545 containerd[1676]: time="2026-01-14T01:44:35.725508118Z" level=info msg="received container exit event container_id:\"b6ab4df23c040366dc69c84b2a8511db28ca7078235904ae9790ea5b1cb8081a\" id:\"b6ab4df23c040366dc69c84b2a8511db28ca7078235904ae9790ea5b1cb8081a\" pid:3330 exit_status:1 exited_at:{seconds:1768355075 nanos:725269317}" Jan 14 01:44:35.725845 systemd[1]: cri-containerd-b6ab4df23c040366dc69c84b2a8511db28ca7078235904ae9790ea5b1cb8081a.scope: Consumed 37.083s CPU time, 102.7M memory peak. Jan 14 01:44:35.729000 audit: BPF prog-id=151 op=UNLOAD Jan 14 01:44:35.731644 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 01:44:35.731977 kernel: audit: type=1334 audit(1768355075.729:897): prog-id=151 op=UNLOAD Jan 14 01:44:35.732021 kernel: audit: type=1334 audit(1768355075.729:898): prog-id=155 op=UNLOAD Jan 14 01:44:35.729000 audit: BPF prog-id=155 op=UNLOAD Jan 14 01:44:35.750848 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b6ab4df23c040366dc69c84b2a8511db28ca7078235904ae9790ea5b1cb8081a-rootfs.mount: Deactivated successfully. 
Jan 14 01:44:36.165034 kubelet[2905]: E0114 01:44:36.164990 2905 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.35.206:35070->10.0.35.230:2379: read: connection timed out" Jan 14 01:44:36.229619 kubelet[2905]: E0114 01:44:36.229571 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:44:36.677277 kubelet[2905]: E0114 01:44:36.677141 2905 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.35.206:34868->10.0.35.230:2379: read: connection timed out" event="&Event{ObjectMeta:{calico-apiserver-675898b8d4-dphrr.188a756d274ef509 calico-apiserver 1720 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-apiserver,Name:calico-apiserver-675898b8d4-dphrr,UID:adf9db04-ef07-4e4b-ac7b-0a044973dca8,APIVersion:v1,ResourceVersion:811,FieldPath:spec.containers{calico-apiserver},},Reason:BackOff,Message:Back-off pulling image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\",Source:EventSource{Component:kubelet,Host:ci-4578-0-0-p-96753e66ce,},FirstTimestamp:2026-01-14 01:41:55 +0000 UTC,LastTimestamp:2026-01-14 01:44:26.229696939 +0000 UTC m=+201.103183014,Count:10,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4578-0-0-p-96753e66ce,}" Jan 14 01:44:36.720362 kubelet[2905]: I0114 
01:44:36.720332 2905 scope.go:117] "RemoveContainer" containerID="bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5" Jan 14 01:44:36.720676 kubelet[2905]: I0114 01:44:36.720643 2905 scope.go:117] "RemoveContainer" containerID="b6ab4df23c040366dc69c84b2a8511db28ca7078235904ae9790ea5b1cb8081a" Jan 14 01:44:36.720846 kubelet[2905]: E0114 01:44:36.720822 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-qllzk_tigera-operator(771e9332-cf06-40d6-9db7-e0ca09e8572b)\"" pod="tigera-operator/tigera-operator-7dcd859c48-qllzk" podUID="771e9332-cf06-40d6-9db7-e0ca09e8572b" Jan 14 01:44:36.722019 containerd[1676]: time="2026-01-14T01:44:36.721987857Z" level=info msg="RemoveContainer for \"bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5\"" Jan 14 01:44:36.727327 containerd[1676]: time="2026-01-14T01:44:36.727295671Z" level=info msg="RemoveContainer for \"bcb9dc991096a1ad1791f0f1ebf07bbe7d362032306ddb8926c42c7986f444f5\" returns successfully" Jan 14 01:44:37.105744 systemd[1]: cri-containerd-435a536ddb1c0891d0028b96ab5116258cd5b16a5fd47ce952a83eea92b1f4ab.scope: Deactivated successfully. Jan 14 01:44:37.106411 systemd[1]: cri-containerd-435a536ddb1c0891d0028b96ab5116258cd5b16a5fd47ce952a83eea92b1f4ab.scope: Consumed 3.830s CPU time, 59.9M memory peak. 
Jan 14 01:44:37.106000 audit: BPF prog-id=261 op=LOAD Jan 14 01:44:37.108827 containerd[1676]: time="2026-01-14T01:44:37.108785107Z" level=info msg="received container exit event container_id:\"435a536ddb1c0891d0028b96ab5116258cd5b16a5fd47ce952a83eea92b1f4ab\" id:\"435a536ddb1c0891d0028b96ab5116258cd5b16a5fd47ce952a83eea92b1f4ab\" pid:2752 exit_status:1 exited_at:{seconds:1768355077 nanos:106907183}" Jan 14 01:44:37.106000 audit: BPF prog-id=87 op=UNLOAD Jan 14 01:44:37.110107 kernel: audit: type=1334 audit(1768355077.106:899): prog-id=261 op=LOAD Jan 14 01:44:37.110240 kernel: audit: type=1334 audit(1768355077.106:900): prog-id=87 op=UNLOAD Jan 14 01:44:37.114000 audit: BPF prog-id=103 op=UNLOAD Jan 14 01:44:37.114000 audit: BPF prog-id=107 op=UNLOAD Jan 14 01:44:37.116963 kernel: audit: type=1334 audit(1768355077.114:901): prog-id=103 op=UNLOAD Jan 14 01:44:37.117026 kernel: audit: type=1334 audit(1768355077.114:902): prog-id=107 op=UNLOAD Jan 14 01:44:37.131364 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-435a536ddb1c0891d0028b96ab5116258cd5b16a5fd47ce952a83eea92b1f4ab-rootfs.mount: Deactivated successfully. 
Jan 14 01:44:37.229953 kubelet[2905]: E0114 01:44:37.229871 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:44:37.726782 kubelet[2905]: I0114 01:44:37.726753 2905 scope.go:117] "RemoveContainer" containerID="435a536ddb1c0891d0028b96ab5116258cd5b16a5fd47ce952a83eea92b1f4ab" Jan 14 01:44:37.728506 containerd[1676]: time="2026-01-14T01:44:37.728463182Z" level=info msg="CreateContainer within sandbox \"9ec47dfaa26e672adb481f36a5b1308fff991f9cbc0940c99e5398bb264d5293\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 14 01:44:37.736745 containerd[1676]: time="2026-01-14T01:44:37.735430359Z" level=info msg="Container 653c807d6abfa6ce77c753b4df9313fad0df1aebb75929cf49dda16d8c5748d3: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:44:37.746723 containerd[1676]: time="2026-01-14T01:44:37.746656867Z" level=info msg="CreateContainer within sandbox \"9ec47dfaa26e672adb481f36a5b1308fff991f9cbc0940c99e5398bb264d5293\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"653c807d6abfa6ce77c753b4df9313fad0df1aebb75929cf49dda16d8c5748d3\"" Jan 14 01:44:37.747157 containerd[1676]: time="2026-01-14T01:44:37.747128109Z" level=info msg="StartContainer for \"653c807d6abfa6ce77c753b4df9313fad0df1aebb75929cf49dda16d8c5748d3\"" Jan 14 01:44:37.748341 containerd[1676]: time="2026-01-14T01:44:37.748306312Z" level=info msg="connecting to shim 653c807d6abfa6ce77c753b4df9313fad0df1aebb75929cf49dda16d8c5748d3" 
address="unix:///run/containerd/s/9823c418787b7867489d8c9262c14b4a6853a297804addbb33cf3fda1c1f7584" protocol=ttrpc version=3 Jan 14 01:44:37.776062 systemd[1]: Started cri-containerd-653c807d6abfa6ce77c753b4df9313fad0df1aebb75929cf49dda16d8c5748d3.scope - libcontainer container 653c807d6abfa6ce77c753b4df9313fad0df1aebb75929cf49dda16d8c5748d3. Jan 14 01:44:37.787000 audit: BPF prog-id=262 op=LOAD Jan 14 01:44:37.789799 kernel: audit: type=1334 audit(1768355077.787:903): prog-id=262 op=LOAD Jan 14 01:44:37.789942 kernel: audit: type=1334 audit(1768355077.789:904): prog-id=263 op=LOAD Jan 14 01:44:37.789000 audit: BPF prog-id=263 op=LOAD Jan 14 01:44:37.789000 audit[5678]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2599 pid=5678 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:37.794927 kernel: audit: type=1300 audit(1768355077.789:904): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2599 pid=5678 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:37.795051 kernel: audit: type=1327 audit(1768355077.789:904): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635336338303764366162666136636537376337353362346466393331 Jan 14 01:44:37.789000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635336338303764366162666136636537376337353362346466393331 Jan 14 01:44:37.789000 audit: 
BPF prog-id=263 op=UNLOAD Jan 14 01:44:37.789000 audit[5678]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=5678 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:37.789000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635336338303764366162666136636537376337353362346466393331 Jan 14 01:44:37.789000 audit: BPF prog-id=264 op=LOAD Jan 14 01:44:37.789000 audit[5678]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2599 pid=5678 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:37.789000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635336338303764366162666136636537376337353362346466393331 Jan 14 01:44:37.790000 audit: BPF prog-id=265 op=LOAD Jan 14 01:44:37.790000 audit[5678]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2599 pid=5678 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:37.790000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635336338303764366162666136636537376337353362346466393331 Jan 14 01:44:37.794000 audit: BPF prog-id=265 op=UNLOAD Jan 14 01:44:37.794000 audit[5678]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=5678 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:37.794000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635336338303764366162666136636537376337353362346466393331 Jan 14 01:44:37.794000 audit: BPF prog-id=264 op=UNLOAD Jan 14 01:44:37.794000 audit[5678]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2599 pid=5678 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:37.794000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635336338303764366162666136636537376337353362346466393331 Jan 14 01:44:37.794000 audit: BPF prog-id=266 op=LOAD Jan 14 01:44:37.794000 audit[5678]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2599 pid=5678 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
14 01:44:37.794000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3635336338303764366162666136636537376337353362346466393331 Jan 14 01:44:37.823101 containerd[1676]: time="2026-01-14T01:44:37.823064219Z" level=info msg="StartContainer for \"653c807d6abfa6ce77c753b4df9313fad0df1aebb75929cf49dda16d8c5748d3\" returns successfully" Jan 14 01:44:38.229651 containerd[1676]: time="2026-01-14T01:44:38.229614079Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:44:38.557508 containerd[1676]: time="2026-01-14T01:44:38.557191861Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:44:38.558807 containerd[1676]: time="2026-01-14T01:44:38.558775784Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:44:38.558978 containerd[1676]: time="2026-01-14T01:44:38.558832665Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:44:38.559140 kubelet[2905]: E0114 01:44:38.559103 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:44:38.559190 kubelet[2905]: E0114 01:44:38.559157 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:44:38.559335 kubelet[2905]: E0114 01:44:38.559276 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9blv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-675898b8d4-fz6xg_calico-apiserver(ac329ad1-edb3-4891-9a01-4d5e568d082e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:44:38.560482 kubelet[2905]: E0114 01:44:38.560438 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:44:40.230785 containerd[1676]: time="2026-01-14T01:44:40.230489658Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 01:44:40.551333 containerd[1676]: time="2026-01-14T01:44:40.551246542Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 
01:44:40.552738 containerd[1676]: time="2026-01-14T01:44:40.552637866Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 01:44:40.552738 containerd[1676]: time="2026-01-14T01:44:40.552687706Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 01:44:40.552978 kubelet[2905]: E0114 01:44:40.552917 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:44:40.552978 kubelet[2905]: E0114 01:44:40.552968 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 01:44:40.553383 kubelet[2905]: E0114 01:44:40.553090 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bz2bm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-twlzn_calico-system(1b8220b4-811d-4471-95d7-cea88df93438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Jan 14 01:44:40.555242 containerd[1676]: time="2026-01-14T01:44:40.555215632Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 01:44:40.886578 containerd[1676]: time="2026-01-14T01:44:40.886414943Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:44:40.887906 containerd[1676]: time="2026-01-14T01:44:40.887866467Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 01:44:40.888005 containerd[1676]: time="2026-01-14T01:44:40.887888947Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 01:44:40.888119 kubelet[2905]: E0114 01:44:40.888086 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:44:40.888197 kubelet[2905]: E0114 01:44:40.888131 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 01:44:40.888281 kubelet[2905]: E0114 01:44:40.888244 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bz2bm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-twlzn_calico-system(1b8220b4-811d-4471-95d7-cea88df93438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 01:44:40.889447 kubelet[2905]: E0114 01:44:40.889412 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:44:41.838640 systemd[1]: cri-containerd-4ef4cffedf687844368f629c4f4c88098c02e6880c283f9a4d77dad7ebaa3749.scope: Deactivated successfully. Jan 14 01:44:41.839086 systemd[1]: cri-containerd-4ef4cffedf687844368f629c4f4c88098c02e6880c283f9a4d77dad7ebaa3749.scope: Consumed 4.809s CPU time, 24.2M memory peak. 
Jan 14 01:44:41.839000 audit: BPF prog-id=267 op=LOAD Jan 14 01:44:41.840885 containerd[1676]: time="2026-01-14T01:44:41.840838177Z" level=info msg="received container exit event container_id:\"4ef4cffedf687844368f629c4f4c88098c02e6880c283f9a4d77dad7ebaa3749\" id:\"4ef4cffedf687844368f629c4f4c88098c02e6880c283f9a4d77dad7ebaa3749\" pid:2738 exit_status:1 exited_at:{seconds:1768355081 nanos:840512616}" Jan 14 01:44:41.841458 kernel: kauditd_printk_skb: 18 callbacks suppressed Jan 14 01:44:41.841546 kernel: audit: type=1334 audit(1768355081.839:911): prog-id=267 op=LOAD Jan 14 01:44:41.839000 audit: BPF prog-id=85 op=UNLOAD Jan 14 01:44:41.842744 kernel: audit: type=1334 audit(1768355081.839:912): prog-id=85 op=UNLOAD Jan 14 01:44:41.844000 audit: BPF prog-id=98 op=UNLOAD Jan 14 01:44:41.844000 audit: BPF prog-id=102 op=UNLOAD Jan 14 01:44:41.846572 kernel: audit: type=1334 audit(1768355081.844:913): prog-id=98 op=UNLOAD Jan 14 01:44:41.846630 kernel: audit: type=1334 audit(1768355081.844:914): prog-id=102 op=UNLOAD Jan 14 01:44:41.861768 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4ef4cffedf687844368f629c4f4c88098c02e6880c283f9a4d77dad7ebaa3749-rootfs.mount: Deactivated successfully. 
Jan 14 01:44:42.740421 kubelet[2905]: I0114 01:44:42.740373 2905 scope.go:117] "RemoveContainer" containerID="4ef4cffedf687844368f629c4f4c88098c02e6880c283f9a4d77dad7ebaa3749" Jan 14 01:44:42.742102 containerd[1676]: time="2026-01-14T01:44:42.742069078Z" level=info msg="CreateContainer within sandbox \"169ccaf4eb92b1321042234eed861ab0b1a7c692193a1e178e62955842d3a3cd\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jan 14 01:44:42.750761 containerd[1676]: time="2026-01-14T01:44:42.750128658Z" level=info msg="Container 7cdb9a79e0b6e569bd8a12229d5e645caf0cd24de6d5c721c0c0cd1adb9fe149: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:44:42.758107 containerd[1676]: time="2026-01-14T01:44:42.758068038Z" level=info msg="CreateContainer within sandbox \"169ccaf4eb92b1321042234eed861ab0b1a7c692193a1e178e62955842d3a3cd\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"7cdb9a79e0b6e569bd8a12229d5e645caf0cd24de6d5c721c0c0cd1adb9fe149\"" Jan 14 01:44:42.758730 containerd[1676]: time="2026-01-14T01:44:42.758698599Z" level=info msg="StartContainer for \"7cdb9a79e0b6e569bd8a12229d5e645caf0cd24de6d5c721c0c0cd1adb9fe149\"" Jan 14 01:44:42.759923 containerd[1676]: time="2026-01-14T01:44:42.759891562Z" level=info msg="connecting to shim 7cdb9a79e0b6e569bd8a12229d5e645caf0cd24de6d5c721c0c0cd1adb9fe149" address="unix:///run/containerd/s/a3fd9a60785d65bc1160d748ff9292b3e776e5ff29585cce16ccef6e3eb2f1af" protocol=ttrpc version=3 Jan 14 01:44:42.778906 systemd[1]: Started cri-containerd-7cdb9a79e0b6e569bd8a12229d5e645caf0cd24de6d5c721c0c0cd1adb9fe149.scope - libcontainer container 7cdb9a79e0b6e569bd8a12229d5e645caf0cd24de6d5c721c0c0cd1adb9fe149. 
Jan 14 01:44:42.788000 audit: BPF prog-id=268 op=LOAD Jan 14 01:44:42.790743 kernel: audit: type=1334 audit(1768355082.788:915): prog-id=268 op=LOAD Jan 14 01:44:42.790000 audit: BPF prog-id=269 op=LOAD Jan 14 01:44:42.790000 audit[5722]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2604 pid=5722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:42.795352 kernel: audit: type=1334 audit(1768355082.790:916): prog-id=269 op=LOAD Jan 14 01:44:42.795406 kernel: audit: type=1300 audit(1768355082.790:916): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2604 pid=5722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:42.795436 kernel: audit: type=1327 audit(1768355082.790:916): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763646239613739653062366535363962643861313232323964356536 Jan 14 01:44:42.790000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763646239613739653062366535363962643861313232323964356536 Jan 14 01:44:42.798936 kernel: audit: type=1334 audit(1768355082.790:917): prog-id=269 op=UNLOAD Jan 14 01:44:42.790000 audit: BPF prog-id=269 op=UNLOAD Jan 14 01:44:42.790000 audit[5722]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2604 pid=5722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:42.803105 kernel: audit: type=1300 audit(1768355082.790:917): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2604 pid=5722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:42.790000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763646239613739653062366535363962643861313232323964356536 Jan 14 01:44:42.790000 audit: BPF prog-id=270 op=LOAD Jan 14 01:44:42.790000 audit[5722]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2604 pid=5722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:42.790000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763646239613739653062366535363962643861313232323964356536 Jan 14 01:44:42.791000 audit: BPF prog-id=271 op=LOAD Jan 14 01:44:42.791000 audit[5722]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2604 pid=5722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:42.791000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763646239613739653062366535363962643861313232323964356536 Jan 14 01:44:42.794000 audit: BPF prog-id=271 op=UNLOAD Jan 14 01:44:42.794000 audit[5722]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2604 pid=5722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:42.794000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763646239613739653062366535363962643861313232323964356536 Jan 14 01:44:42.794000 audit: BPF prog-id=270 op=UNLOAD Jan 14 01:44:42.794000 audit[5722]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2604 pid=5722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:42.794000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763646239613739653062366535363962643861313232323964356536 Jan 14 01:44:42.794000 audit: BPF prog-id=272 op=LOAD Jan 14 01:44:42.794000 audit[5722]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2604 pid=5722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 
14 01:44:42.794000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3763646239613739653062366535363962643861313232323964356536 Jan 14 01:44:42.828410 containerd[1676]: time="2026-01-14T01:44:42.828371774Z" level=info msg="StartContainer for \"7cdb9a79e0b6e569bd8a12229d5e645caf0cd24de6d5c721c0c0cd1adb9fe149\" returns successfully" Jan 14 01:44:44.420980 kubelet[2905]: I0114 01:44:44.420866 2905 status_manager.go:895] "Failed to get status for pod" podUID="771e9332-cf06-40d6-9db7-e0ca09e8572b" pod="tigera-operator/tigera-operator-7dcd859c48-qllzk" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.35.206:34982->10.0.35.230:2379: read: connection timed out" Jan 14 01:44:46.165989 kubelet[2905]: E0114 01:44:46.165923 2905 controller.go:195] "Failed to update lease" err="Put \"https://10.0.35.206:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4578-0-0-p-96753e66ce?timeout=10s\": context deadline exceeded" Jan 14 01:44:46.229927 kubelet[2905]: E0114 01:44:46.229857 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-jg7bs" podUID="e6d84f7a-c9f6-41d8-94ff-304c6e803e1e" Jan 14 01:44:46.230064 containerd[1676]: time="2026-01-14T01:44:46.230028827Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 01:44:46.558867 containerd[1676]: time="2026-01-14T01:44:46.558808771Z" level=info msg="fetch failed after status: 404 Not Found" 
host=ghcr.io Jan 14 01:44:46.560276 containerd[1676]: time="2026-01-14T01:44:46.560228975Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 01:44:46.560365 containerd[1676]: time="2026-01-14T01:44:46.560309615Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 01:44:46.560528 kubelet[2905]: E0114 01:44:46.560460 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:44:46.560528 kubelet[2905]: E0114 01:44:46.560521 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 01:44:46.560672 kubelet[2905]: E0114 01:44:46.560639 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1bdf15913aa7461abab7765f5b689915,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gtqjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-77b7df9c9-vcm8x_calico-system(afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 01:44:46.562924 containerd[1676]: time="2026-01-14T01:44:46.562897541Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 01:44:46.893082 containerd[1676]: 
time="2026-01-14T01:44:46.892884889Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:44:46.894708 containerd[1676]: time="2026-01-14T01:44:46.894668934Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 01:44:46.894788 containerd[1676]: time="2026-01-14T01:44:46.894743334Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 01:44:46.894961 kubelet[2905]: E0114 01:44:46.894911 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:44:46.895023 kubelet[2905]: E0114 01:44:46.894962 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 01:44:46.895150 kubelet[2905]: E0114 01:44:46.895088 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gtqjc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-77b7df9c9-vcm8x_calico-system(afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 01:44:46.896317 kubelet[2905]: E0114 01:44:46.896267 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-77b7df9c9-vcm8x" podUID="afb8f8b9-3d6e-46a2-b7ae-ef74d2aa4a11" Jan 14 01:44:49.229190 kubelet[2905]: I0114 01:44:49.229138 2905 scope.go:117] "RemoveContainer" containerID="b6ab4df23c040366dc69c84b2a8511db28ca7078235904ae9790ea5b1cb8081a" Jan 14 01:44:49.232250 containerd[1676]: time="2026-01-14T01:44:49.231969756Z" level=info msg="CreateContainer within sandbox \"edcc7f32f2e0ed11b4938dde7cada78d8a3ed5c68b3af018761b4aa2009ba1c4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:2,}" Jan 14 01:44:49.239568 containerd[1676]: time="2026-01-14T01:44:49.239523775Z" level=info msg="Container 11881cfc67d7822105a9dd357c32e1610eabb630f8735653c719360a37e03568: CDI devices from CRI Config.CDIDevices: []" Jan 14 01:44:49.246876 containerd[1676]: time="2026-01-14T01:44:49.246831794Z" level=info msg="CreateContainer within sandbox \"edcc7f32f2e0ed11b4938dde7cada78d8a3ed5c68b3af018761b4aa2009ba1c4\" for &ContainerMetadata{Name:tigera-operator,Attempt:2,} returns container id \"11881cfc67d7822105a9dd357c32e1610eabb630f8735653c719360a37e03568\"" Jan 14 01:44:49.247320 containerd[1676]: time="2026-01-14T01:44:49.247298435Z" level=info 
msg="StartContainer for \"11881cfc67d7822105a9dd357c32e1610eabb630f8735653c719360a37e03568\"" Jan 14 01:44:49.248541 containerd[1676]: time="2026-01-14T01:44:49.248515638Z" level=info msg="connecting to shim 11881cfc67d7822105a9dd357c32e1610eabb630f8735653c719360a37e03568" address="unix:///run/containerd/s/588911ba48772895c681676704d18d1534961b186f9fc1d625b50572b007f43d" protocol=ttrpc version=3 Jan 14 01:44:49.269896 systemd[1]: Started cri-containerd-11881cfc67d7822105a9dd357c32e1610eabb630f8735653c719360a37e03568.scope - libcontainer container 11881cfc67d7822105a9dd357c32e1610eabb630f8735653c719360a37e03568. Jan 14 01:44:49.279000 audit: BPF prog-id=273 op=LOAD Jan 14 01:44:49.280758 kernel: kauditd_printk_skb: 16 callbacks suppressed Jan 14 01:44:49.280809 kernel: audit: type=1334 audit(1768355089.279:923): prog-id=273 op=LOAD Jan 14 01:44:49.281000 audit: BPF prog-id=274 op=LOAD Jan 14 01:44:49.281000 audit[5761]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3029 pid=5761 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:49.286368 kernel: audit: type=1334 audit(1768355089.281:924): prog-id=274 op=LOAD Jan 14 01:44:49.286404 kernel: audit: type=1300 audit(1768355089.281:924): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3029 pid=5761 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:49.286432 kernel: audit: type=1327 audit(1768355089.281:924): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131383831636663363764373832323130356139646433353763333265 Jan 14 01:44:49.281000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131383831636663363764373832323130356139646433353763333265 Jan 14 01:44:49.281000 audit: BPF prog-id=274 op=UNLOAD Jan 14 01:44:49.290753 kernel: audit: type=1334 audit(1768355089.281:925): prog-id=274 op=UNLOAD Jan 14 01:44:49.290795 kernel: audit: type=1300 audit(1768355089.281:925): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3029 pid=5761 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:49.281000 audit[5761]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3029 pid=5761 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:49.281000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131383831636663363764373832323130356139646433353763333265 Jan 14 01:44:49.298210 kernel: audit: type=1327 audit(1768355089.281:925): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131383831636663363764373832323130356139646433353763333265 Jan 14 01:44:49.281000 audit: BPF prog-id=275 op=LOAD Jan 14 01:44:49.299420 kernel: audit: type=1334 audit(1768355089.281:926): prog-id=275 op=LOAD Jan 14 01:44:49.299517 kernel: audit: type=1300 audit(1768355089.281:926): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3029 pid=5761 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:49.281000 audit[5761]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3029 pid=5761 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:49.281000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131383831636663363764373832323130356139646433353763333265 Jan 14 01:44:49.306516 kernel: audit: type=1327 audit(1768355089.281:926): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131383831636663363764373832323130356139646433353763333265 Jan 14 01:44:49.282000 audit: BPF prog-id=276 op=LOAD Jan 14 01:44:49.282000 audit[5761]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3029 pid=5761 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:49.282000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131383831636663363764373832323130356139646433353763333265 Jan 14 01:44:49.285000 audit: BPF prog-id=276 op=UNLOAD Jan 14 01:44:49.285000 audit[5761]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3029 pid=5761 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:49.285000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131383831636663363764373832323130356139646433353763333265 Jan 14 01:44:49.285000 audit: BPF prog-id=275 op=UNLOAD Jan 14 01:44:49.285000 audit[5761]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3029 pid=5761 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:49.285000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131383831636663363764373832323130356139646433353763333265 Jan 14 01:44:49.285000 audit: BPF prog-id=277 op=LOAD Jan 14 01:44:49.285000 audit[5761]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3029 pid=5761 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 01:44:49.285000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131383831636663363764373832323130356139646433353763333265 Jan 14 01:44:49.322018 containerd[1676]: time="2026-01-14T01:44:49.321981302Z" level=info msg="StartContainer for \"11881cfc67d7822105a9dd357c32e1610eabb630f8735653c719360a37e03568\" returns successfully" Jan 14 01:44:50.230506 kubelet[2905]: E0114 01:44:50.230413 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b67dd7dbc-k5lcl" podUID="8f23c871-1821-4e53-80e3-947513960a4b" Jan 14 01:44:52.230374 containerd[1676]: time="2026-01-14T01:44:52.230227837Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 01:44:52.547411 containerd[1676]: time="2026-01-14T01:44:52.547341432Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 01:44:52.549972 containerd[1676]: time="2026-01-14T01:44:52.549933839Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 01:44:52.550053 containerd[1676]: 
time="2026-01-14T01:44:52.550008639Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 01:44:52.550182 kubelet[2905]: E0114 01:44:52.550134 2905 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:44:52.550444 kubelet[2905]: E0114 01:44:52.550193 2905 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 01:44:52.550444 kubelet[2905]: E0114 01:44:52.550313 2905 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fjtvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-675898b8d4-dphrr_calico-apiserver(adf9db04-ef07-4e4b-ac7b-0a044973dca8): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 01:44:52.551566 kubelet[2905]: E0114 01:44:52.551536 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-dphrr" podUID="adf9db04-ef07-4e4b-ac7b-0a044973dca8" Jan 14 01:44:53.231404 kubelet[2905]: E0114 01:44:53.230459 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-675898b8d4-fz6xg" podUID="ac329ad1-edb3-4891-9a01-4d5e568d082e" Jan 14 01:44:55.230255 kubelet[2905]: E0114 01:44:55.230205 2905 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-twlzn" podUID="1b8220b4-811d-4471-95d7-cea88df93438" Jan 14 01:44:56.167133 kubelet[2905]: E0114 01:44:56.166877 2905 controller.go:195] "Failed to update lease" err="Put \"https://10.0.35.206:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4578-0-0-p-96753e66ce?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 14 01:44:58.353767 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec