Jan 28 00:05:33.494796 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Jan 28 00:05:33.494819 kernel: Linux version 6.12.66-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Tue Jan 27 22:20:26 -00 2026 Jan 28 00:05:33.494829 kernel: KASLR enabled Jan 28 00:05:33.494835 kernel: efi: EFI v2.7 by EDK II Jan 28 00:05:33.494841 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438357218 Jan 28 00:05:33.494846 kernel: random: crng init done Jan 28 00:05:33.494853 kernel: secureboot: Secure boot disabled Jan 28 00:05:33.494859 kernel: ACPI: Early table checksum verification disabled Jan 28 00:05:33.494865 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS ) Jan 28 00:05:33.494873 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013) Jan 28 00:05:33.494879 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Jan 28 00:05:33.494885 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 28 00:05:33.494891 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001) Jan 28 00:05:33.494897 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 28 00:05:33.494906 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 28 00:05:33.494913 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 28 00:05:33.494919 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 28 00:05:33.494926 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Jan 28 00:05:33.494932 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 28 00:05:33.494939 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 28 00:05:33.494945 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013) Jan 28 00:05:33.494952 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600 Jan 28 00:05:33.494958 kernel: ACPI: Use ACPI SPCR as default console: Yes Jan 28 00:05:33.494966 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff] Jan 28 00:05:33.494972 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff] Jan 28 00:05:33.494979 kernel: Zone ranges: Jan 28 00:05:33.494985 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Jan 28 00:05:33.494991 kernel: DMA32 empty Jan 28 00:05:33.494998 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff] Jan 28 00:05:33.495004 kernel: Device empty Jan 28 00:05:33.495010 kernel: Movable zone start for each node Jan 28 00:05:33.495017 kernel: Early memory node ranges Jan 28 00:05:33.495023 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff] Jan 28 00:05:33.495030 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff] Jan 28 00:05:33.495036 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff] Jan 28 00:05:33.495043 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff] Jan 28 00:05:33.495050 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff] Jan 28 00:05:33.495056 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff] Jan 28 00:05:33.495063 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1 Jan 28 00:05:33.495069 kernel: psci: probing for conduit method from ACPI. 
Jan 28 00:05:33.495078 kernel: psci: PSCIv1.3 detected in firmware. Jan 28 00:05:33.495086 kernel: psci: Using standard PSCI v0.2 function IDs Jan 28 00:05:33.495093 kernel: psci: Trusted OS migration not required Jan 28 00:05:33.495100 kernel: psci: SMC Calling Convention v1.1 Jan 28 00:05:33.495107 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Jan 28 00:05:33.495114 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0 Jan 28 00:05:33.495121 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0 Jan 28 00:05:33.495127 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0 Jan 28 00:05:33.495134 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0 Jan 28 00:05:33.495142 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Jan 28 00:05:33.495149 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Jan 28 00:05:33.495156 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3 Jan 28 00:05:33.495163 kernel: Detected PIPT I-cache on CPU0 Jan 28 00:05:33.495169 kernel: CPU features: detected: GIC system register CPU interface Jan 28 00:05:33.495176 kernel: CPU features: detected: Spectre-v4 Jan 28 00:05:33.495183 kernel: CPU features: detected: Spectre-BHB Jan 28 00:05:33.495189 kernel: CPU features: kernel page table isolation forced ON by KASLR Jan 28 00:05:33.495196 kernel: CPU features: detected: Kernel page table isolation (KPTI) Jan 28 00:05:33.495203 kernel: CPU features: detected: ARM erratum 1418040 Jan 28 00:05:33.495223 kernel: CPU features: detected: SSBS not fully self-synchronizing Jan 28 00:05:33.495232 kernel: alternatives: applying boot alternatives Jan 28 00:05:33.495239 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=880c7a57ca1a4cf41361128ef304e12abcda0ba85f8697ad932e9820a1865169 Jan 28 00:05:33.495247 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear) Jan 28 00:05:33.495254 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear) Jan 28 00:05:33.495261 kernel: Fallback order for Node 0: 0 Jan 28 00:05:33.495267 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304 Jan 28 00:05:33.495274 kernel: Policy zone: Normal Jan 28 00:05:33.495281 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 28 00:05:33.495288 kernel: software IO TLB: area num 4. Jan 28 00:05:33.495295 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB) Jan 28 00:05:33.495303 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Jan 28 00:05:33.495310 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 28 00:05:33.495317 kernel: rcu: RCU event tracing is enabled. Jan 28 00:05:33.495324 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Jan 28 00:05:33.495331 kernel: Trampoline variant of Tasks RCU enabled. Jan 28 00:05:33.495338 kernel: Tracing variant of Tasks RCU enabled. Jan 28 00:05:33.495345 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 28 00:05:33.495352 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Jan 28 00:05:33.495359 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. 
Jan 28 00:05:33.495366 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Jan 28 00:05:33.495373 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Jan 28 00:05:33.495380 kernel: GICv3: 256 SPIs implemented Jan 28 00:05:33.495387 kernel: GICv3: 0 Extended SPIs implemented Jan 28 00:05:33.495394 kernel: Root IRQ handler: gic_handle_irq Jan 28 00:05:33.495400 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Jan 28 00:05:33.495407 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Jan 28 00:05:33.495414 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Jan 28 00:05:33.495421 kernel: ITS [mem 0x08080000-0x0809ffff] Jan 28 00:05:33.495428 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1) Jan 28 00:05:33.495435 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1) Jan 28 00:05:33.495442 kernel: GICv3: using LPI property table @0x0000000100130000 Jan 28 00:05:33.495449 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000 Jan 28 00:05:33.495456 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jan 28 00:05:33.495464 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 28 00:05:33.495470 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Jan 28 00:05:33.495477 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Jan 28 00:05:33.495484 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Jan 28 00:05:33.495491 kernel: arm-pv: using stolen time PV Jan 28 00:05:33.495499 kernel: Console: colour dummy device 80x25 Jan 28 00:05:33.495506 kernel: ACPI: Core revision 20240827 Jan 28 00:05:33.495513 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Jan 28 00:05:33.495522 kernel: pid_max: default: 32768 minimum: 301 Jan 28 00:05:33.495529 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jan 28 00:05:33.495537 kernel: landlock: Up and running. Jan 28 00:05:33.495544 kernel: SELinux: Initializing. Jan 28 00:05:33.495551 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 28 00:05:33.495558 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 28 00:05:33.495566 kernel: rcu: Hierarchical SRCU implementation. Jan 28 00:05:33.495573 kernel: rcu: Max phase no-delay instances is 400. Jan 28 00:05:33.495582 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jan 28 00:05:33.495589 kernel: Remapping and enabling EFI services. Jan 28 00:05:33.495596 kernel: smp: Bringing up secondary CPUs ... 
Jan 28 00:05:33.495603 kernel: Detected PIPT I-cache on CPU1 Jan 28 00:05:33.495610 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Jan 28 00:05:33.495618 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000 Jan 28 00:05:33.495625 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 28 00:05:33.495633 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Jan 28 00:05:33.495640 kernel: Detected PIPT I-cache on CPU2 Jan 28 00:05:33.495652 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000 Jan 28 00:05:33.495660 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000 Jan 28 00:05:33.495668 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 28 00:05:33.495675 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1] Jan 28 00:05:33.495683 kernel: Detected PIPT I-cache on CPU3 Jan 28 00:05:33.495690 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000 Jan 28 00:05:33.495699 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000 Jan 28 00:05:33.495707 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 28 00:05:33.495714 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1] Jan 28 00:05:33.495721 kernel: smp: Brought up 1 node, 4 CPUs Jan 28 00:05:33.495729 kernel: SMP: Total of 4 processors activated. Jan 28 00:05:33.495736 kernel: CPU: All CPU(s) started at EL1 Jan 28 00:05:33.495744 kernel: CPU features: detected: 32-bit EL0 Support Jan 28 00:05:33.495752 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Jan 28 00:05:33.495760 kernel: CPU features: detected: Common not Private translations Jan 28 00:05:33.495767 kernel: CPU features: detected: CRC32 instructions Jan 28 00:05:33.495775 kernel: CPU features: detected: Enhanced Virtualization Traps Jan 28 00:05:33.495782 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Jan 28 00:05:33.495789 kernel: CPU features: detected: LSE atomic instructions Jan 28 00:05:33.495798 kernel: CPU features: detected: Privileged Access Never Jan 28 00:05:33.495806 kernel: CPU features: detected: RAS Extension Support Jan 28 00:05:33.495813 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Jan 28 00:05:33.495821 kernel: alternatives: applying system-wide alternatives Jan 28 00:05:33.495828 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3 Jan 28 00:05:33.495836 kernel: Memory: 16324368K/16777216K available (11200K kernel code, 2458K rwdata, 9092K rodata, 12480K init, 1038K bss, 430064K reserved, 16384K cma-reserved) Jan 28 00:05:33.495843 kernel: devtmpfs: initialized Jan 28 00:05:33.495852 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 28 00:05:33.495860 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Jan 28 00:05:33.495868 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Jan 28 00:05:33.495875 kernel: 0 pages in range for non-PLT usage Jan 28 00:05:33.495882 kernel: 515152 pages in range for PLT usage Jan 28 00:05:33.495889 kernel: pinctrl core: initialized pinctrl subsystem Jan 28 00:05:33.495897 kernel: SMBIOS 3.0.0 present. 
Jan 28 00:05:33.495904 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015 Jan 28 00:05:33.495913 kernel: DMI: Memory slots populated: 1/1 Jan 28 00:05:33.495920 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 28 00:05:33.495927 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations Jan 28 00:05:33.495935 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Jan 28 00:05:33.495943 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Jan 28 00:05:33.495950 kernel: audit: initializing netlink subsys (disabled) Jan 28 00:05:33.495958 kernel: audit: type=2000 audit(0.037:1): state=initialized audit_enabled=0 res=1 Jan 28 00:05:33.495966 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 28 00:05:33.495974 kernel: cpuidle: using governor menu Jan 28 00:05:33.495981 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Jan 28 00:05:33.495989 kernel: ASID allocator initialised with 32768 entries Jan 28 00:05:33.495996 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 28 00:05:33.496004 kernel: Serial: AMBA PL011 UART driver Jan 28 00:05:33.496011 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 28 00:05:33.496020 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Jan 28 00:05:33.496027 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Jan 28 00:05:33.496035 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Jan 28 00:05:33.496042 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 28 00:05:33.496050 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Jan 28 00:05:33.496057 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Jan 28 00:05:33.496064 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Jan 28 00:05:33.496072 kernel: ACPI: Added _OSI(Module Device) Jan 28 00:05:33.496080 kernel: ACPI: Added _OSI(Processor Device) Jan 28 00:05:33.496088 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 28 00:05:33.496095 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 28 00:05:33.496103 kernel: ACPI: Interpreter enabled Jan 28 00:05:33.496110 kernel: ACPI: Using GIC for interrupt routing Jan 28 00:05:33.496118 kernel: ACPI: MCFG table detected, 1 entries Jan 28 00:05:33.496125 kernel: ACPI: CPU0 has been hot-added Jan 28 00:05:33.496134 kernel: ACPI: CPU1 has been hot-added Jan 28 00:05:33.496141 kernel: ACPI: CPU2 has been hot-added Jan 28 00:05:33.496148 kernel: ACPI: CPU3 has been hot-added Jan 28 00:05:33.496156 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Jan 28 00:05:33.496163 kernel: printk: legacy console [ttyAMA0] enabled Jan 28 00:05:33.496171 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 28 00:05:33.496322 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 28 00:05:33.496410 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Jan 28 00:05:33.496489 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Jan 28 00:05:33.496567 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Jan 28 00:05:33.496644 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Jan 28 00:05:33.496653 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Jan 28 00:05:33.496661 
kernel: PCI host bridge to bus 0000:00 Jan 28 00:05:33.496746 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Jan 28 00:05:33.496818 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Jan 28 00:05:33.496888 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Jan 28 00:05:33.496958 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 28 00:05:33.497067 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Jan 28 00:05:33.497156 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.497274 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff] Jan 28 00:05:33.497358 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 28 00:05:33.497437 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff] Jan 28 00:05:33.497515 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref] Jan 28 00:05:33.497600 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.497684 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff] Jan 28 00:05:33.497764 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Jan 28 00:05:33.497844 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff] Jan 28 00:05:33.497931 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.498010 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff] Jan 28 00:05:33.498117 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Jan 28 00:05:33.498201 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff] Jan 28 00:05:33.498301 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref] Jan 28 00:05:33.498390 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.498471 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff] Jan 28 00:05:33.498552 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Jan 28 00:05:33.498635 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref] Jan 28 00:05:33.498736 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.498839 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff] Jan 28 00:05:33.498930 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Jan 28 00:05:33.499021 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff] Jan 28 00:05:33.499121 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref] Jan 28 00:05:33.499223 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.499310 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff] Jan 28 00:05:33.499388 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 28 00:05:33.499465 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff] Jan 28 00:05:33.499543 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref] Jan 28 00:05:33.499626 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.499708 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff] Jan 28 00:05:33.499787 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 28 00:05:33.499870 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.499954 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff] Jan 28 00:05:33.500033 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 28 
00:05:33.500120 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.500202 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff] Jan 28 00:05:33.500294 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 28 00:05:33.500380 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.500458 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff] Jan 28 00:05:33.500538 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 28 00:05:33.500621 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.500698 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff] Jan 28 00:05:33.500784 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Jan 28 00:05:33.500869 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.500947 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff] Jan 28 00:05:33.501026 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 28 00:05:33.501111 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.501190 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff] Jan 28 00:05:33.501294 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 28 00:05:33.501381 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.501458 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff] Jan 28 00:05:33.501540 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 28 00:05:33.501624 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.501702 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff] Jan 28 00:05:33.501781 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 28 00:05:33.501866 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.501946 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff] Jan 28 00:05:33.502023 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 28 00:05:33.502128 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.502224 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff] Jan 28 00:05:33.502316 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 28 00:05:33.502401 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.502484 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff] Jan 28 00:05:33.502563 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 28 00:05:33.502641 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff] Jan 28 00:05:33.502719 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff] Jan 28 00:05:33.502802 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.502880 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff] Jan 28 00:05:33.502960 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 28 00:05:33.503037 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff] Jan 28 00:05:33.503115 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff] Jan 28 00:05:33.503200 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.503308 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff] Jan 28 00:05:33.503388 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 28 00:05:33.503491 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff] Jan 28 00:05:33.503591 kernel: pci 0000:00:03.3: bridge window [mem 
0x11a00000-0x11bfffff] Jan 28 00:05:33.503680 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.503761 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff] Jan 28 00:05:33.503840 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 28 00:05:33.503920 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff] Jan 28 00:05:33.504003 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff] Jan 28 00:05:33.504095 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.504175 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff] Jan 28 00:05:33.504271 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 28 00:05:33.504351 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff] Jan 28 00:05:33.504429 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff] Jan 28 00:05:33.504516 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.504594 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff] Jan 28 00:05:33.504672 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 28 00:05:33.504751 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff] Jan 28 00:05:33.504831 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff] Jan 28 00:05:33.504929 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.505010 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff] Jan 28 00:05:33.505088 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 28 00:05:33.505164 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff] Jan 28 00:05:33.505252 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff] Jan 28 00:05:33.505340 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.505419 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff] Jan 28 00:05:33.505498 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 28 00:05:33.505575 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff] Jan 28 00:05:33.505652 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff] Jan 28 00:05:33.505734 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.505813 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff] Jan 28 00:05:33.505891 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 28 00:05:33.505971 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff] Jan 28 00:05:33.506068 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff] Jan 28 00:05:33.506160 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.506279 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff] Jan 28 00:05:33.506361 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 28 00:05:33.506437 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff] Jan 28 00:05:33.506518 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff] Jan 28 00:05:33.506602 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.506680 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff] Jan 28 00:05:33.506757 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 28 00:05:33.506834 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff] Jan 28 00:05:33.506910 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff] Jan 28 00:05:33.506996 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 
00:05:33.507074 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff] Jan 28 00:05:33.507151 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 28 00:05:33.507241 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff] Jan 28 00:05:33.507322 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff] Jan 28 00:05:33.507407 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.507486 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff] Jan 28 00:05:33.507565 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 28 00:05:33.507644 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff] Jan 28 00:05:33.507727 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff] Jan 28 00:05:33.507819 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.507898 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff] Jan 28 00:05:33.507976 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 28 00:05:33.508053 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff] Jan 28 00:05:33.508129 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff] Jan 28 00:05:33.508224 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.508323 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff] Jan 28 00:05:33.508402 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 28 00:05:33.508481 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff] Jan 28 00:05:33.508562 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff] Jan 28 00:05:33.508649 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 28 00:05:33.508733 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff] Jan 28 00:05:33.508812 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 28 00:05:33.508891 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff] Jan 28 00:05:33.508969 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff] Jan 28 00:05:33.509057 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 28 00:05:33.509138 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff] Jan 28 00:05:33.510308 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Jan 28 00:05:33.510428 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 28 00:05:33.510523 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Jan 28 00:05:33.510607 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit] Jan 28 00:05:33.510697 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint Jan 28 00:05:33.510786 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff] Jan 28 00:05:33.510867 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref] Jan 28 00:05:33.510955 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 28 00:05:33.511037 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref] Jan 28 00:05:33.511126 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 28 00:05:33.511244 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff] Jan 28 00:05:33.511338 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref] Jan 28 00:05:33.511430 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint Jan 28 00:05:33.511512 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff] Jan 28 00:05:33.511592 kernel: pci 
0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref] Jan 28 00:05:33.511672 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Jan 28 00:05:33.511754 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Jan 28 00:05:33.511833 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Jan 28 00:05:33.511915 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Jan 28 00:05:33.511993 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Jan 28 00:05:33.512073 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Jan 28 00:05:33.512153 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 28 00:05:33.512257 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Jan 28 00:05:33.512339 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Jan 28 00:05:33.512420 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 28 00:05:33.512500 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Jan 28 00:05:33.512578 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Jan 28 00:05:33.512660 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Jan 28 00:05:33.512738 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Jan 28 00:05:33.512815 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 Jan 28 00:05:33.512896 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 28 00:05:33.512976 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Jan 28 00:05:33.513054 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Jan 28 00:05:33.513136 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 28 00:05:33.513227 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000 Jan 28 00:05:33.513318 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000 Jan 28 00:05:33.513403 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 28 00:05:33.513483 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Jan 28 00:05:33.513562 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Jan 28 00:05:33.513642 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 28 00:05:33.513719 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Jan 28 00:05:33.513796 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] 
add_size 200000 add_align 100000 Jan 28 00:05:33.513880 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 28 00:05:33.513958 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000 Jan 28 00:05:33.514036 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000 Jan 28 00:05:33.514142 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000 Jan 28 00:05:33.514261 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Jan 28 00:05:33.514346 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000 Jan 28 00:05:33.514442 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000 Jan 28 00:05:33.514521 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000 Jan 28 00:05:33.514599 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000 Jan 28 00:05:33.514680 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000 Jan 28 00:05:33.514758 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000 Jan 28 00:05:33.514835 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000 Jan 28 00:05:33.514918 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 28 00:05:33.514996 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Jan 28 00:05:33.515074 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Jan 28 00:05:33.515153 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 28 00:05:33.515245 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Jan 28 00:05:33.515346 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Jan 28 00:05:33.515430 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 28 00:05:33.515508 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Jan 28 00:05:33.515585 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Jan 28 00:05:33.515665 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 28 00:05:33.515744 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Jan 28 00:05:33.515832 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Jan 28 00:05:33.515921 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 28 00:05:33.516000 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Jan 28 00:05:33.516078 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 
Jan 28 00:05:33.516159 kernel: pci 0000:00:03.2: bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Jan 28 00:05:33.516257 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Jan 28 00:05:33.516338 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Jan 28 00:05:33.516419 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Jan 28 00:05:33.516504 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Jan 28 00:05:33.516583 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Jan 28 00:05:33.516667 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Jan 28 00:05:33.516748 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Jan 28 00:05:33.516826 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Jan 28 00:05:33.516907 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 28 00:05:33.516985 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Jan 28 00:05:33.517063 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Jan 28 00:05:33.517143 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 28 00:05:33.517237 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Jan 28 00:05:33.517318 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Jan 28 00:05:33.517400 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 28 00:05:33.517478 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Jan 28 00:05:33.517557 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Jan 28 00:05:33.517639 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 28 00:05:33.517721 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Jan 28 00:05:33.517800 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Jan 28 00:05:33.517882 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 28 00:05:33.517960 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Jan 28 00:05:33.518038 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Jan 28 00:05:33.518146 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Jan 28 00:05:33.518243 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Jan 28 00:05:33.518328 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Jan 28 00:05:33.518411 kernel: pci 
0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Jan 28 00:05:33.518491 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Jan 28 00:05:33.518572 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Jan 28 00:05:33.518653 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 28 00:05:33.518732 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Jan 28 00:05:33.518811 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Jan 28 00:05:33.518893 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 28 00:05:33.518972 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Jan 28 00:05:33.519052 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Jan 28 00:05:33.519135 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 28 00:05:33.519224 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Jan 28 00:05:33.519307 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Jan 28 00:05:33.519388 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 28 00:05:33.519468 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Jan 28 00:05:33.519551 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Jan 28 00:05:33.519635 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 28 00:05:33.519717 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Jan 28 00:05:33.519800 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Jan 28 00:05:33.519900 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Jan 28 00:05:33.519983 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Jan 28 00:05:33.520063 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Jan 28 00:05:33.520141 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Jan 28 00:05:33.520228 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Jan 28 00:05:33.520308 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Jan 28 00:05:33.520388 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Jan 28 00:05:33.520468 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Jan 28 00:05:33.520549 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Jan 28 00:05:33.520627 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Jan 28 00:05:33.520707 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Jan 28 00:05:33.520785 kernel: pci 0000:00:01.5: bridge 
window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Jan 28 00:05:33.520873 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Jan 28 00:05:33.520955 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Jan 28 00:05:33.521038 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Jan 28 00:05:33.521116 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Jan 28 00:05:33.521194 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Jan 28 00:05:33.521287 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Jan 28 00:05:33.521367 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Jan 28 00:05:33.521446 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: assigned Jan 28 00:05:33.521543 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Jan 28 00:05:33.521621 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Jan 28 00:05:33.521701 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Jan 28 00:05:33.521779 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Jan 28 00:05:33.521858 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Jan 28 00:05:33.521936 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Jan 28 00:05:33.522017 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Jan 28 00:05:33.522117 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Jan 28 00:05:33.522202 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Jan 28 00:05:33.522302 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Jan 28 00:05:33.522385 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Jan 28 00:05:33.522465 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Jan 28 00:05:33.522545 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Jan 28 00:05:33.522627 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Jan 28 00:05:33.522708 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Jan 28 00:05:33.522790 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Jan 28 00:05:33.522871 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Jan 28 00:05:33.522951 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Jan 28 00:05:33.523031 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Jan 28 00:05:33.523111 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Jan 28 00:05:33.523192 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Jan 28 00:05:33.523290 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Jan 28 00:05:33.523372 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Jan 28 00:05:33.523451 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Jan 28 00:05:33.523532 
kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Jan 28 00:05:33.523612 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Jan 28 00:05:33.523696 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Jan 28 00:05:33.523778 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Jan 28 00:05:33.523861 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Jan 28 00:05:33.523960 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Jan 28 00:05:33.524042 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Jan 28 00:05:33.524122 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Jan 28 00:05:33.524216 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Jan 28 00:05:33.524303 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Jan 28 00:05:33.524384 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Jan 28 00:05:33.524463 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Jan 28 00:05:33.524542 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff]: assigned Jan 28 00:05:33.524621 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Jan 28 00:05:33.524702 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Jan 28 00:05:33.524787 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Jan 28 00:05:33.524871 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Jan 28 00:05:33.524952 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Jan 28 00:05:33.525032 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Jan 28 00:05:33.525110 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Jan 28 00:05:33.525188 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Jan 28 00:05:33.525279 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Jan 28 00:05:33.525378 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Jan 28 00:05:33.525456 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Jan 28 00:05:33.525535 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Jan 28 00:05:33.525614 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Jan 28 00:05:33.525692 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Jan 28 00:05:33.525772 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Jan 28 00:05:33.525851 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Jan 28 00:05:33.525928 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Jan 28 00:05:33.526007 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Jan 28 00:05:33.526110 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Jan 28 00:05:33.526195 kernel: pci 0000:00:01.5: BAR 0 [mem 0x14205000-0x14205fff]: assigned Jan 28 00:05:33.526313 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Jan 28 00:05:33.526412 kernel: pci 0000:00:01.6: BAR 0 [mem 
0x14206000-0x14206fff]: assigned Jan 28 00:05:33.526491 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Jan 28 00:05:33.526571 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Jan 28 00:05:33.526649 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Jan 28 00:05:33.526727 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Jan 28 00:05:33.526805 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Jan 28 00:05:33.526887 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Jan 28 00:05:33.526964 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Jan 28 00:05:33.527042 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Jan 28 00:05:33.527119 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Jan 28 00:05:33.527197 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Jan 28 00:05:33.527301 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Jan 28 00:05:33.527382 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Jan 28 00:05:33.527465 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Jan 28 00:05:33.527546 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Jan 28 00:05:33.527627 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Jan 28 00:05:33.527711 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Jan 28 00:05:33.527794 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Jan 28 00:05:33.527876 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Jan 28 00:05:33.527956 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.528035 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.528173 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Jan 28 00:05:33.528269 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.528384 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.528468 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Jan 28 00:05:33.528546 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.528624 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.528703 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Jan 28 00:05:33.528781 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.528861 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.528941 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Jan 28 00:05:33.529018 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.529097 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.529176 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Jan 28 00:05:33.529270 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.529350 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.529434 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Jan 28 00:05:33.529513 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't 
assign; no space Jan 28 00:05:33.529592 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.529673 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Jan 28 00:05:33.529752 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.529834 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.529921 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Jan 28 00:05:33.530006 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.530109 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.530198 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Jan 28 00:05:33.530338 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.530423 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.530512 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Jan 28 00:05:33.530594 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.530671 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.530750 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Jan 28 00:05:33.530829 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.530907 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.530986 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Jan 28 00:05:33.531072 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.531158 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.531255 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Jan 28 00:05:33.531343 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.531422 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.531503 kernel: pci 0000:00:04.5: BAR 0 [mem 0x1421d000-0x1421dfff]: assigned Jan 28 00:05:33.531585 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.531662 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.531761 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Jan 28 00:05:33.531840 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.531918 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.531998 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Jan 28 00:05:33.532076 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.532155 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.532244 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Jan 28 00:05:33.532325 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.532403 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.532482 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Jan 28 00:05:33.532561 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Jan 28 
00:05:33.532644 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Jan 28 00:05:33.532723 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Jan 28 00:05:33.532802 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Jan 28 00:05:33.532883 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Jan 28 00:05:33.533000 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Jan 28 00:05:33.533086 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Jan 28 00:05:33.533165 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Jan 28 00:05:33.533262 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff]: assigned Jan 28 00:05:33.533373 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Jan 28 00:05:33.533459 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Jan 28 00:05:33.533547 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Jan 28 00:05:33.533631 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Jan 28 00:05:33.533711 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Jan 28 00:05:33.533791 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.533870 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.533951 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.534031 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.534130 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.534226 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.534326 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.534414 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.534495 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.534598 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.534680 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.534759 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.534839 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.534919 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.535003 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.535088 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.535172 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.535268 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.535359 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.535440 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.535524 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.535605 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.535684 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.535762 kernel: pci 
0000:00:01.6: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.535842 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.535920 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.536000 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.536087 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.536167 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.536261 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.536342 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.536421 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.536503 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.536582 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.536662 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: can't assign; no space Jan 28 00:05:33.536740 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Jan 28 00:05:33.536826 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Jan 28 00:05:33.536906 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Jan 28 00:05:33.536990 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Jan 28 00:05:33.537069 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 28 00:05:33.537156 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Jan 28 00:05:33.537251 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Jan 28 00:05:33.537340 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Jan 28 00:05:33.537419 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Jan 28 00:05:33.537499 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Jan 28 00:05:33.537577 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Jan 28 00:05:33.537661 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Jan 28 00:05:33.537741 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Jan 28 00:05:33.537818 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Jan 28 00:05:33.537895 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Jan 28 00:05:33.537973 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Jan 28 00:05:33.538074 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Jan 28 00:05:33.538159 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Jan 28 00:05:33.538253 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Jan 28 00:05:33.538335 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Jan 28 00:05:33.538419 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Jan 28 00:05:33.538502 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Jan 28 00:05:33.538580 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Jan 28 00:05:33.538658 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Jan 28 00:05:33.538736 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Jan 28 00:05:33.538832 
kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Jan 28 00:05:33.538919 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Jan 28 00:05:33.539001 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Jan 28 00:05:33.539079 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Jan 28 00:05:33.539158 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 28 00:05:33.539249 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Jan 28 00:05:33.539330 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Jan 28 00:05:33.539410 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 28 00:05:33.539492 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Jan 28 00:05:33.539571 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Jan 28 00:05:33.539649 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 28 00:05:33.539730 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Jan 28 00:05:33.539809 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Jan 28 00:05:33.539890 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jan 28 00:05:33.539971 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Jan 28 00:05:33.540051 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Jan 28 00:05:33.540129 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Jan 28 00:05:33.540216 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b] Jan 28 00:05:33.540300 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Jan 28 00:05:33.540381 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Jan 28 00:05:33.540464 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Jan 28 00:05:33.540546 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Jan 28 00:05:33.540628 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Jan 28 00:05:33.540730 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Jan 28 00:05:33.540809 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Jan 28 00:05:33.540892 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Jan 28 00:05:33.540971 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Jan 28 00:05:33.541050 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Jan 28 00:05:33.541127 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 28 00:05:33.541213 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Jan 28 00:05:33.541298 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Jan 28 00:05:33.541377 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 28 00:05:33.541457 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Jan 28 00:05:33.541536 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Jan 28 00:05:33.541622 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 28 00:05:33.541708 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Jan 28 00:05:33.541791 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Jan 28 00:05:33.541868 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Jan 28 00:05:33.541947 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Jan 28 00:05:33.542026 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Jan 28 00:05:33.542124 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Jan 28 00:05:33.542245 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Jan 28 00:05:33.542331 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Jan 28 00:05:33.542410 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Jan 28 00:05:33.542490 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Jan 28 00:05:33.542573 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Jan 28 00:05:33.542655 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Jan 28 00:05:33.542737 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Jan 28 00:05:33.542817 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Jan 28 00:05:33.542901 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Jan 28 00:05:33.542984 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Jan 28 00:05:33.543066 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Jan 28 00:05:33.543146 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Jan 28 00:05:33.543241 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Jan 28 00:05:33.543325 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Jan 28 00:05:33.543404 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Jan 28 00:05:33.543483 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 28 00:05:33.543563 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Jan 28 00:05:33.543643 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Jan 28 00:05:33.543721 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Jan 28 00:05:33.543799 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 28 00:05:33.543883 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Jan 28 00:05:33.543965 kernel: pci 0000:00:03.7: bridge window [io 0xa000-0xafff] Jan 28 00:05:33.544043 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Jan 28 00:05:33.544121 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Jan 28 00:05:33.544201 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Jan 28 00:05:33.544293 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Jan 28 00:05:33.544378 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Jan 28 00:05:33.544482 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Jan 28 00:05:33.544571 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Jan 28 00:05:33.544650 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Jan 28 00:05:33.544728 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Jan 28 00:05:33.544805 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Jan 28 00:05:33.544885 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Jan 28 00:05:33.544966 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Jan 28 00:05:33.545044 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Jan 28 00:05:33.545121 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Jan 28 00:05:33.545201 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Jan 28 00:05:33.545293 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Jan 28 00:05:33.545375 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Jan 28 
00:05:33.545452 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Jan 28 00:05:33.545536 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Jan 28 00:05:33.545615 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Jan 28 00:05:33.545693 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Jan 28 00:05:33.545771 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] Jan 28 00:05:33.545849 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Jan 28 00:05:33.545928 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Jan 28 00:05:33.546009 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Jan 28 00:05:33.546107 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 28 00:05:33.546191 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Jan 28 00:05:33.546287 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Jan 28 00:05:33.546367 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Jan 28 00:05:33.546447 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 28 00:05:33.546531 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Jan 28 00:05:33.546615 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Jan 28 00:05:33.546703 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Jan 28 00:05:33.546783 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 28 00:05:33.546867 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Jan 28 00:05:33.546947 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Jan 28 00:05:33.547025 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Jan 28 00:05:33.547104 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Jan 28 00:05:33.547187 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jan 28 00:05:33.547277 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Jan 28 00:05:33.547351 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jan 28 00:05:33.547437 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jan 28 00:05:33.547511 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jan 28 00:05:33.547602 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jan 28 00:05:33.547677 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jan 28 00:05:33.547758 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jan 28 00:05:33.547833 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jan 28 00:05:33.547914 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jan 28 00:05:33.547987 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jan 28 00:05:33.548069 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jan 28 00:05:33.548142 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jan 28 00:05:33.548236 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jan 28 00:05:33.548321 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 28 00:05:33.548402 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jan 28 00:05:33.548479 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 28 00:05:33.548559 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jan 28 
00:05:33.548632 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 28 00:05:33.548712 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jan 28 00:05:33.548786 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jan 28 00:05:33.548866 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Jan 28 00:05:33.548940 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Jan 28 00:05:33.549025 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Jan 28 00:05:33.549099 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Jan 28 00:05:33.549177 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Jan 28 00:05:33.549383 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Jan 28 00:05:33.549477 kernel: pci_bus 0000:0d: resource 1 [mem 0x11800000-0x119fffff] Jan 28 00:05:33.549553 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Jan 28 00:05:33.549632 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Jan 28 00:05:33.549705 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Jan 28 00:05:33.549787 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Jan 28 00:05:33.549859 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Jan 28 00:05:33.549939 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Jan 28 00:05:33.550012 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Jan 28 00:05:33.550109 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Jan 28 00:05:33.550187 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Jan 28 00:05:33.550293 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Jan 28 00:05:33.550369 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Jan 28 00:05:33.550449 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Jan 28 00:05:33.550533 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Jan 28 00:05:33.550612 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Jan 28 00:05:33.550699 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Jan 28 00:05:33.550773 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Jan 28 00:05:33.550848 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Jan 28 00:05:33.550929 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Jan 28 00:05:33.551005 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Jan 28 00:05:33.551079 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Jan 28 00:05:33.551161 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Jan 28 00:05:33.551256 kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Jan 28 00:05:33.551333 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Jan 28 00:05:33.551415 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Jan 28 00:05:33.551491 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Jan 28 00:05:33.551569 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Jan 28 00:05:33.551649 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Jan 28 00:05:33.551723 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Jan 28 00:05:33.551797 kernel: pci_bus 0000:18: resource 2 [mem 
0x8002e00000-0x8002ffffff 64bit pref] Jan 28 00:05:33.551880 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Jan 28 00:05:33.551956 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Jan 28 00:05:33.552028 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Jan 28 00:05:33.552108 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Jan 28 00:05:33.552181 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Jan 28 00:05:33.552268 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Jan 28 00:05:33.552386 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 28 00:05:33.552467 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Jan 28 00:05:33.552544 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Jan 28 00:05:33.552629 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Jan 28 00:05:33.552736 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Jan 28 00:05:33.552816 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Jan 28 00:05:33.552900 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Jan 28 00:05:33.553002 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Jan 28 00:05:33.553082 kernel: pci_bus 0000:1d: resource 2 [mem 0x8003800000-0x80039fffff 64bit pref] Jan 28 00:05:33.553162 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Jan 28 00:05:33.553252 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Jan 28 00:05:33.553330 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Jan 28 00:05:33.553414 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Jan 28 00:05:33.553489 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Jan 28 00:05:33.553562 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Jan 28 00:05:33.553642 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Jan 28 00:05:33.553727 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Jan 28 00:05:33.553802 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Jan 28 00:05:33.553904 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Jan 28 00:05:33.553978 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Jan 28 00:05:33.554071 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Jan 28 00:05:33.554083 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jan 28 00:05:33.554091 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jan 28 00:05:33.554100 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jan 28 00:05:33.554111 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jan 28 00:05:33.554119 kernel: iommu: Default domain type: Translated Jan 28 00:05:33.554127 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jan 28 00:05:33.554135 kernel: efivars: Registered efivars operations Jan 28 00:05:33.554143 kernel: vgaarb: loaded Jan 28 00:05:33.554151 kernel: clocksource: Switched to clocksource arch_sys_counter Jan 28 00:05:33.554159 kernel: VFS: Disk quotas dquot_6.6.0 Jan 28 00:05:33.554169 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 28 00:05:33.554176 kernel: pnp: PnP ACPI init Jan 28 00:05:33.554298 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jan 28 00:05:33.554312 kernel: pnp: PnP ACPI: found 1 devices Jan 28 00:05:33.554320 kernel: NET: Registered 
PF_INET protocol family Jan 28 00:05:33.554328 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 28 00:05:33.554337 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Jan 28 00:05:33.554348 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 28 00:05:33.554356 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 28 00:05:33.554364 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Jan 28 00:05:33.554372 kernel: TCP: Hash tables configured (established 131072 bind 65536) Jan 28 00:05:33.554380 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 28 00:05:33.554389 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Jan 28 00:05:33.554397 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 28 00:05:33.554486 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jan 28 00:05:33.554498 kernel: PCI: CLS 0 bytes, default 64 Jan 28 00:05:33.554506 kernel: kvm [1]: HYP mode not available Jan 28 00:05:33.554514 kernel: Initialise system trusted keyrings Jan 28 00:05:33.554522 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Jan 28 00:05:33.554531 kernel: Key type asymmetric registered Jan 28 00:05:33.554538 kernel: Asymmetric key parser 'x509' registered Jan 28 00:05:33.554548 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jan 28 00:05:33.554556 kernel: io scheduler mq-deadline registered Jan 28 00:05:33.554564 kernel: io scheduler kyber registered Jan 28 00:05:33.554572 kernel: io scheduler bfq registered Jan 28 00:05:33.554581 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jan 28 00:05:33.554680 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Jan 28 00:05:33.554761 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Jan 28 00:05:33.554857 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.554937 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Jan 28 00:05:33.555015 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Jan 28 00:05:33.555093 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.555173 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Jan 28 00:05:33.555269 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Jan 28 00:05:33.555351 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.555431 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Jan 28 00:05:33.555509 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Jan 28 00:05:33.555587 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.555666 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Jan 28 00:05:33.555767 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Jan 28 00:05:33.555878 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.555966 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Jan 28 00:05:33.556046 kernel: pcieport 0000:00:01.5: AER: 
enabled with IRQ 55 Jan 28 00:05:33.556126 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.556218 kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Jan 28 00:05:33.556308 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Jan 28 00:05:33.558260 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.558352 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Jan 28 00:05:33.558431 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Jan 28 00:05:33.558510 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.558522 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jan 28 00:05:33.558600 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Jan 28 00:05:33.558679 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Jan 28 00:05:33.558760 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.558840 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Jan 28 00:05:33.558918 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Jan 28 00:05:33.559004 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.559087 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Jan 28 00:05:33.559166 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Jan 28 00:05:33.559264 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.559350 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Jan 28 00:05:33.559433 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Jan 28 00:05:33.559511 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.559590 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Jan 28 00:05:33.559668 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Jan 28 00:05:33.559746 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.559856 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Jan 28 00:05:33.559939 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Jan 28 00:05:33.560029 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.560110 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Jan 28 00:05:33.560187 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Jan 28 00:05:33.560283 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.560369 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Jan 28 00:05:33.560449 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Jan 28 00:05:33.560528 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- 
LLActRep+ Jan 28 00:05:33.560539 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 28 00:05:33.560618 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Jan 28 00:05:33.560697 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Jan 28 00:05:33.560779 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.560859 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Jan 28 00:05:33.560938 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Jan 28 00:05:33.561016 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.561097 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Jan 28 00:05:33.561177 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Jan 28 00:05:33.561268 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.561351 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Jan 28 00:05:33.561429 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Jan 28 00:05:33.561506 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.561586 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Jan 28 00:05:33.561665 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Jan 28 00:05:33.561744 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.561827 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Jan 28 00:05:33.561906 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Jan 28 00:05:33.561983 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.562084 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Jan 28 00:05:33.562169 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Jan 28 00:05:33.562272 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.562359 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Jan 28 00:05:33.562449 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Jan 28 00:05:33.562529 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.562540 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jan 28 00:05:33.562620 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Jan 28 00:05:33.562699 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Jan 28 00:05:33.562777 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.562865 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Jan 28 00:05:33.562948 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Jan 28 00:05:33.563026 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.563108 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Jan 28 
00:05:33.563187 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Jan 28 00:05:33.563287 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.563374 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Jan 28 00:05:33.563453 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Jan 28 00:05:33.563531 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.563611 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Jan 28 00:05:33.563689 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Jan 28 00:05:33.563767 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.563850 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Jan 28 00:05:33.563928 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Jan 28 00:05:33.564006 kernel: pcieport 0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.564086 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Jan 28 00:05:33.564165 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Jan 28 00:05:33.564253 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.564334 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Jan 28 00:05:33.564417 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Jan 28 00:05:33.564495 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.564575 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Jan 28 00:05:33.564653 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Jan 28 00:05:33.564732 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 28 00:05:33.564743 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jan 28 00:05:33.564753 kernel: ACPI: button: Power Button [PWRB] Jan 28 00:05:33.564836 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Jan 28 00:05:33.564921 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jan 28 00:05:33.564932 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 28 00:05:33.564940 kernel: thunder_xcv, ver 1.0 Jan 28 00:05:33.564948 kernel: thunder_bgx, ver 1.0 Jan 28 00:05:33.564956 kernel: nicpf, ver 1.0 Jan 28 00:05:33.564965 kernel: nicvf, ver 1.0 Jan 28 00:05:33.565061 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jan 28 00:05:33.565137 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-01-28T00:05:32 UTC (1769558732) Jan 28 00:05:33.565147 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 28 00:05:33.565155 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jan 28 00:05:33.565163 kernel: watchdog: NMI not fully supported Jan 28 00:05:33.565173 kernel: watchdog: Hard watchdog permanently disabled Jan 28 00:05:33.565181 kernel: NET: Registered PF_INET6 protocol family Jan 28 00:05:33.565189 kernel: Segment Routing with IPv6 Jan 28 00:05:33.565197 kernel: In-situ OAM (IOAM) with 
IPv6 Jan 28 00:05:33.565216 kernel: NET: Registered PF_PACKET protocol family Jan 28 00:05:33.565224 kernel: Key type dns_resolver registered Jan 28 00:05:33.565232 kernel: registered taskstats version 1 Jan 28 00:05:33.565240 kernel: Loading compiled-in X.509 certificates Jan 28 00:05:33.565250 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.66-flatcar: 9b9d0a6e8555c4a74bcb93286e875e2244e1db21' Jan 28 00:05:33.565258 kernel: Demotion targets for Node 0: null Jan 28 00:05:33.565266 kernel: Key type .fscrypt registered Jan 28 00:05:33.565274 kernel: Key type fscrypt-provisioning registered Jan 28 00:05:33.565282 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 28 00:05:33.565290 kernel: ima: Allocated hash algorithm: sha1 Jan 28 00:05:33.565298 kernel: ima: No architecture policies found Jan 28 00:05:33.565308 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jan 28 00:05:33.565316 kernel: clk: Disabling unused clocks Jan 28 00:05:33.565324 kernel: PM: genpd: Disabling unused power domains Jan 28 00:05:33.565332 kernel: Freeing unused kernel memory: 12480K Jan 28 00:05:33.565340 kernel: Run /init as init process Jan 28 00:05:33.565348 kernel: with arguments: Jan 28 00:05:33.565357 kernel: /init Jan 28 00:05:33.565366 kernel: with environment: Jan 28 00:05:33.565373 kernel: HOME=/ Jan 28 00:05:33.565381 kernel: TERM=linux Jan 28 00:05:33.565389 kernel: ACPI: bus type USB registered Jan 28 00:05:33.565397 kernel: usbcore: registered new interface driver usbfs Jan 28 00:05:33.565405 kernel: usbcore: registered new interface driver hub Jan 28 00:05:33.565413 kernel: usbcore: registered new device driver usb Jan 28 00:05:33.565504 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 28 00:05:33.565587 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 28 00:05:33.565668 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 28 00:05:33.565748 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 28 00:05:33.565827 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 28 00:05:33.565907 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 28 00:05:33.566013 kernel: hub 1-0:1.0: USB hub found Jan 28 00:05:33.566130 kernel: hub 1-0:1.0: 4 ports detected Jan 28 00:05:33.566247 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 28 00:05:33.566346 kernel: hub 2-0:1.0: USB hub found Jan 28 00:05:33.566432 kernel: hub 2-0:1.0: 4 ports detected Jan 28 00:05:33.566525 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Jan 28 00:05:33.566608 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Jan 28 00:05:33.566619 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 28 00:05:33.566628 kernel: GPT:25804799 != 104857599 Jan 28 00:05:33.566636 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 28 00:05:33.566645 kernel: GPT:25804799 != 104857599 Jan 28 00:05:33.566652 kernel: GPT: Use GNU Parted to correct GPT errors. Jan 28 00:05:33.566662 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 28 00:05:33.566671 kernel: SCSI subsystem initialized Jan 28 00:05:33.566679 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Jan 28 00:05:33.566687 kernel: device-mapper: uevent: version 1.0.3 Jan 28 00:05:33.566696 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 28 00:05:33.566704 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jan 28 00:05:33.566714 kernel: raid6: neonx8 gen() 15720 MB/s Jan 28 00:05:33.566723 kernel: raid6: neonx4 gen() 15508 MB/s Jan 28 00:05:33.566731 kernel: raid6: neonx2 gen() 12459 MB/s Jan 28 00:05:33.566739 kernel: raid6: neonx1 gen() 10456 MB/s Jan 28 00:05:33.566747 kernel: raid6: int64x8 gen() 6833 MB/s Jan 28 00:05:33.566756 kernel: raid6: int64x4 gen() 7357 MB/s Jan 28 00:05:33.566764 kernel: raid6: int64x2 gen() 6109 MB/s Jan 28 00:05:33.566772 kernel: raid6: int64x1 gen() 5044 MB/s Jan 28 00:05:33.566782 kernel: raid6: using algorithm neonx8 gen() 15720 MB/s Jan 28 00:05:33.566790 kernel: raid6: .... xor() 12054 MB/s, rmw enabled Jan 28 00:05:33.566799 kernel: raid6: using neon recovery algorithm Jan 28 00:05:33.566807 kernel: xor: measuring software checksum speed Jan 28 00:05:33.566818 kernel: 8regs : 21613 MB/sec Jan 28 00:05:33.566826 kernel: 32regs : 21699 MB/sec Jan 28 00:05:33.566835 kernel: arm64_neon : 24579 MB/sec Jan 28 00:05:33.566844 kernel: xor: using function: arm64_neon (24579 MB/sec) Jan 28 00:05:33.566852 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 28 00:05:33.566949 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 28 00:05:33.566962 kernel: BTRFS: device fsid f7176ebb-63b5-458d-bfa0-a0dcd6bb053d devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (276) Jan 28 00:05:33.566971 kernel: BTRFS info (device dm-0): first mount of filesystem f7176ebb-63b5-458d-bfa0-a0dcd6bb053d Jan 28 00:05:33.566980 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jan 28 00:05:33.566990 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 28 00:05:33.566998 kernel: BTRFS info (device dm-0): enabling free space tree Jan 28 00:05:33.567006 kernel: loop: module loaded Jan 28 00:05:33.567015 kernel: loop0: detected capacity change from 0 to 91832 Jan 28 00:05:33.567023 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 28 00:05:33.567124 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jan 28 00:05:33.567138 systemd[1]: Successfully made /usr/ read-only. Jan 28 00:05:33.567150 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 28 00:05:33.567159 systemd[1]: Detected virtualization kvm. Jan 28 00:05:33.567168 systemd[1]: Detected architecture arm64. Jan 28 00:05:33.567176 systemd[1]: Running in initrd. Jan 28 00:05:33.567184 systemd[1]: No hostname configured, using default hostname. Jan 28 00:05:33.567195 systemd[1]: Hostname set to . Jan 28 00:05:33.567212 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 28 00:05:33.567221 systemd[1]: Queued start job for default target initrd.target. Jan 28 00:05:33.567230 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 28 00:05:33.567239 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Jan 28 00:05:33.567248 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 28 00:05:33.567259 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 28 00:05:33.567268 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 28 00:05:33.567278 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 28 00:05:33.567287 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 28 00:05:33.567296 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 28 00:05:33.567304 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 28 00:05:33.567314 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 28 00:05:33.567323 systemd[1]: Reached target paths.target - Path Units. Jan 28 00:05:33.567332 systemd[1]: Reached target slices.target - Slice Units. Jan 28 00:05:33.567340 systemd[1]: Reached target swap.target - Swaps. Jan 28 00:05:33.567349 systemd[1]: Reached target timers.target - Timer Units. Jan 28 00:05:33.567357 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 28 00:05:33.567366 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 28 00:05:33.567376 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 28 00:05:33.567385 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 28 00:05:33.567393 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 28 00:05:33.567402 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 28 00:05:33.567411 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 28 00:05:33.567419 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 28 00:05:33.567430 systemd[1]: Reached target sockets.target - Socket Units. Jan 28 00:05:33.567439 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 28 00:05:33.567448 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 28 00:05:33.567457 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 28 00:05:33.567466 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 28 00:05:33.567475 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 28 00:05:33.567484 systemd[1]: Starting systemd-fsck-usr.service... Jan 28 00:05:33.567494 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 28 00:05:33.567503 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 28 00:05:33.567512 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 00:05:33.567521 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 28 00:05:33.567531 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 28 00:05:33.567540 systemd[1]: Finished systemd-fsck-usr.service. Jan 28 00:05:33.567549 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... 
Jan 28 00:05:33.567580 systemd-journald[419]: Collecting audit messages is enabled. Jan 28 00:05:33.567603 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 28 00:05:33.567611 kernel: Bridge firewalling registered Jan 28 00:05:33.567620 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 28 00:05:33.567629 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 28 00:05:33.567638 kernel: audit: type=1130 audit(1769558733.508:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.567648 kernel: audit: type=1130 audit(1769558733.512:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.567657 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 00:05:33.567667 kernel: audit: type=1130 audit(1769558733.517:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.567676 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 28 00:05:33.567685 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 28 00:05:33.567693 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 28 00:05:33.567704 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 28 00:05:33.567713 kernel: audit: type=1130 audit(1769558733.536:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.567722 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 28 00:05:33.567732 kernel: audit: type=1334 audit(1769558733.538:6): prog-id=6 op=LOAD Jan 28 00:05:33.567741 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 28 00:05:33.567751 kernel: audit: type=1130 audit(1769558733.548:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.567759 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 28 00:05:33.567770 kernel: audit: type=1130 audit(1769558733.555:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.567779 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 28 00:05:33.567788 systemd-journald[419]: Journal started Jan 28 00:05:33.567807 systemd-journald[419]: Runtime Journal (/run/log/journal/14cdf137335d468d8f1790d27f444676) is 8M, max 319.5M, 311.5M free. Jan 28 00:05:33.508000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:05:33.512000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.517000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.536000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.538000 audit: BPF prog-id=6 op=LOAD Jan 28 00:05:33.548000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.555000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.504038 systemd-modules-load[420]: Inserted module 'br_netfilter' Jan 28 00:05:33.581464 systemd[1]: Started systemd-journald.service - Journal Service. Jan 28 00:05:33.582000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.585227 kernel: audit: type=1130 audit(1769558733.582:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.585749 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 28 00:05:33.591373 systemd-resolved[437]: Positive Trust Anchors: Jan 28 00:05:33.591389 systemd-resolved[437]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 28 00:05:33.591392 systemd-resolved[437]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 28 00:05:33.594774 dracut-cmdline[449]: dracut-109 Jan 28 00:05:33.591424 systemd-resolved[437]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 28 00:05:33.604469 dracut-cmdline[449]: Using kernel command line parameters: SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=880c7a57ca1a4cf41361128ef304e12abcda0ba85f8697ad932e9820a1865169 Jan 28 00:05:33.600647 systemd-tmpfiles[458]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. 
Jan 28 00:05:33.610000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.608932 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 28 00:05:33.616232 kernel: audit: type=1130 audit(1769558733.610:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.622672 systemd-resolved[437]: Defaulting to hostname 'linux'. Jan 28 00:05:33.623692 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 28 00:05:33.624000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.624784 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 28 00:05:33.675230 kernel: Loading iSCSI transport class v2.0-870. Jan 28 00:05:33.688259 kernel: iscsi: registered transport (tcp) Jan 28 00:05:33.704261 kernel: iscsi: registered transport (qla4xxx) Jan 28 00:05:33.704332 kernel: QLogic iSCSI HBA Driver Jan 28 00:05:33.730115 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 28 00:05:33.751166 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 28 00:05:33.752000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.753743 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 28 00:05:33.796283 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 28 00:05:33.797000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.798192 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 28 00:05:33.799740 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 28 00:05:33.828027 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 28 00:05:33.829000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.829000 audit: BPF prog-id=7 op=LOAD Jan 28 00:05:33.829000 audit: BPF prog-id=8 op=LOAD Jan 28 00:05:33.830475 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 28 00:05:33.859324 systemd-udevd[691]: Using default interface naming scheme 'v257'. Jan 28 00:05:33.868051 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 28 00:05:33.869000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.871435 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Jan 28 00:05:33.888475 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 28 00:05:33.890000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.891000 audit: BPF prog-id=9 op=LOAD Jan 28 00:05:33.892088 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 28 00:05:33.894350 dracut-pre-trigger[769]: rd.md=0: removing MD RAID activation Jan 28 00:05:33.919287 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 28 00:05:33.920000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.921473 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 28 00:05:33.929739 systemd-networkd[800]: lo: Link UP Jan 28 00:05:33.929748 systemd-networkd[800]: lo: Gained carrier Jan 28 00:05:33.931000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:33.930198 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 28 00:05:33.931613 systemd[1]: Reached target network.target - Network. Jan 28 00:05:34.005185 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 28 00:05:34.006000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:34.008844 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 28 00:05:34.077082 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jan 28 00:05:34.090705 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jan 28 00:05:34.099352 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 28 00:05:34.107494 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 28 00:05:34.114351 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jan 28 00:05:34.114376 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 28 00:05:34.110102 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 28 00:05:34.118478 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jan 28 00:05:34.132342 disk-uuid[868]: Primary Header is updated. Jan 28 00:05:34.132342 disk-uuid[868]: Secondary Entries is updated. Jan 28 00:05:34.132342 disk-uuid[868]: Secondary Header is updated. Jan 28 00:05:34.133383 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 28 00:05:34.135000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:05:34.133497 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 00:05:34.135720 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 00:05:34.139154 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 00:05:34.141080 systemd-networkd[800]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 28 00:05:34.141083 systemd-networkd[800]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 28 00:05:34.141546 systemd-networkd[800]: eth0: Link UP Jan 28 00:05:34.143215 systemd-networkd[800]: eth0: Gained carrier Jan 28 00:05:34.143228 systemd-networkd[800]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 28 00:05:34.167839 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 00:05:34.168000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:34.173245 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jan 28 00:05:34.173530 kernel: usbcore: registered new interface driver usbhid Jan 28 00:05:34.174462 kernel: usbhid: USB HID core driver Jan 28 00:05:34.203346 systemd-networkd[800]: eth0: DHCPv4 address 10.0.1.105/25, gateway 10.0.1.1 acquired from 10.0.1.1 Jan 28 00:05:34.225451 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 28 00:05:34.227000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:34.228299 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 28 00:05:34.230846 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 28 00:05:34.231965 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 28 00:05:34.234791 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 28 00:05:34.273247 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 28 00:05:34.273000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:35.182962 disk-uuid[869]: Warning: The kernel is still using the old partition table. Jan 28 00:05:35.182962 disk-uuid[869]: The new table will be used at the next reboot or after you Jan 28 00:05:35.182962 disk-uuid[869]: run partprobe(8) or kpartx(8) Jan 28 00:05:35.182962 disk-uuid[869]: The operation has completed successfully. Jan 28 00:05:35.188201 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 28 00:05:35.188343 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 28 00:05:35.190000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:05:35.190000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:35.191092 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 28 00:05:35.227226 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (900) Jan 28 00:05:35.229953 kernel: BTRFS info (device vda6): first mount of filesystem 7e41befc-1b7e-4b8d-988a-42f0dc79ed16 Jan 28 00:05:35.230174 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 28 00:05:35.236680 kernel: BTRFS info (device vda6): turning on async discard Jan 28 00:05:35.236707 kernel: BTRFS info (device vda6): enabling free space tree Jan 28 00:05:35.242234 kernel: BTRFS info (device vda6): last unmount of filesystem 7e41befc-1b7e-4b8d-988a-42f0dc79ed16 Jan 28 00:05:35.242759 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 28 00:05:35.243000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:35.244666 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 28 00:05:35.403889 ignition[919]: Ignition 2.24.0 Jan 28 00:05:35.403905 ignition[919]: Stage: fetch-offline Jan 28 00:05:35.403942 ignition[919]: no configs at "/usr/lib/ignition/base.d" Jan 28 00:05:35.403951 ignition[919]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 28 00:05:35.407279 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 28 00:05:35.410000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:35.404107 ignition[919]: parsed url from cmdline: "" Jan 28 00:05:35.404110 ignition[919]: no config URL provided Jan 28 00:05:35.404729 ignition[919]: reading system config file "/usr/lib/ignition/user.ign" Jan 28 00:05:35.411531 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 28 00:05:35.404739 ignition[919]: no config at "/usr/lib/ignition/user.ign" Jan 28 00:05:35.404744 ignition[919]: failed to fetch config: resource requires networking Jan 28 00:05:35.404894 ignition[919]: Ignition finished successfully Jan 28 00:05:35.433991 ignition[933]: Ignition 2.24.0 Jan 28 00:05:35.434011 ignition[933]: Stage: fetch Jan 28 00:05:35.434164 ignition[933]: no configs at "/usr/lib/ignition/base.d" Jan 28 00:05:35.434173 ignition[933]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 28 00:05:35.434270 ignition[933]: parsed url from cmdline: "" Jan 28 00:05:35.434273 ignition[933]: no config URL provided Jan 28 00:05:35.434277 ignition[933]: reading system config file "/usr/lib/ignition/user.ign" Jan 28 00:05:35.434283 ignition[933]: no config at "/usr/lib/ignition/user.ign" Jan 28 00:05:35.434443 ignition[933]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 28 00:05:35.434461 ignition[933]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Jan 28 00:05:35.434470 ignition[933]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 28 00:05:35.573523 systemd-networkd[800]: eth0: Gained IPv6LL Jan 28 00:05:36.434808 ignition[933]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 28 00:05:36.434925 ignition[933]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 28 00:05:37.391386 ignition[933]: GET result: OK Jan 28 00:05:37.391626 ignition[933]: parsing config with SHA512: 7f1b3009a919b7b2a6dd916bd71febac4c598c5bc5582f86608275ec0ef26b2f689657136bedd6290e969f1926da8c312a3d3c1e20df6a14612dd50ff3adadba Jan 28 00:05:37.396363 unknown[933]: fetched base config from "system" Jan 28 00:05:37.396374 unknown[933]: fetched base config from "system" Jan 28 00:05:37.396683 ignition[933]: fetch: fetch complete Jan 28 00:05:37.396379 unknown[933]: fetched user config from "openstack" Jan 28 00:05:37.396688 ignition[933]: fetch: fetch passed Jan 28 00:05:37.404713 kernel: kauditd_printk_skb: 20 callbacks suppressed Jan 28 00:05:37.404736 kernel: audit: type=1130 audit(1769558737.401:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:37.401000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:37.400454 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 28 00:05:37.396724 ignition[933]: Ignition finished successfully Jan 28 00:05:37.402319 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 28 00:05:37.427023 ignition[941]: Ignition 2.24.0 Jan 28 00:05:37.427040 ignition[941]: Stage: kargs Jan 28 00:05:37.427178 ignition[941]: no configs at "/usr/lib/ignition/base.d" Jan 28 00:05:37.427186 ignition[941]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 28 00:05:37.429535 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 28 00:05:37.434284 kernel: audit: type=1130 audit(1769558737.430:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:37.430000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:37.427913 ignition[941]: kargs: kargs passed Jan 28 00:05:37.432057 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 28 00:05:37.427953 ignition[941]: Ignition finished successfully Jan 28 00:05:37.455432 ignition[948]: Ignition 2.24.0 Jan 28 00:05:37.455445 ignition[948]: Stage: disks Jan 28 00:05:37.455581 ignition[948]: no configs at "/usr/lib/ignition/base.d" Jan 28 00:05:37.455588 ignition[948]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 28 00:05:37.458327 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 28 00:05:37.459000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:05:37.463245 kernel: audit: type=1130 audit(1769558737.459:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:37.456317 ignition[948]: disks: disks passed Jan 28 00:05:37.460380 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 28 00:05:37.456361 ignition[948]: Ignition finished successfully Jan 28 00:05:37.464614 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 28 00:05:37.466192 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 28 00:05:37.467974 systemd[1]: Reached target sysinit.target - System Initialization. Jan 28 00:05:37.469405 systemd[1]: Reached target basic.target - Basic System. Jan 28 00:05:37.471877 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 28 00:05:37.527044 systemd-fsck[957]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 28 00:05:37.532139 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 28 00:05:37.537294 kernel: audit: type=1130 audit(1769558737.533:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:37.533000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:37.534533 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 28 00:05:37.633261 kernel: EXT4-fs (vda9): mounted filesystem e122e254-04a8-47c4-9c16-e71d001bbc70 r/w with ordered data mode. Quota mode: none. Jan 28 00:05:37.633908 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 28 00:05:37.635157 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 28 00:05:37.638729 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 28 00:05:37.640467 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 28 00:05:37.641412 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 28 00:05:37.642021 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 28 00:05:37.644381 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 28 00:05:37.644413 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 28 00:05:37.653061 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 28 00:05:37.655122 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 28 00:05:37.666220 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (965) Jan 28 00:05:37.669505 kernel: BTRFS info (device vda6): first mount of filesystem 7e41befc-1b7e-4b8d-988a-42f0dc79ed16 Jan 28 00:05:37.669543 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 28 00:05:37.677645 kernel: BTRFS info (device vda6): turning on async discard Jan 28 00:05:37.677750 kernel: BTRFS info (device vda6): enabling free space tree Jan 28 00:05:37.678739 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jan 28 00:05:37.714234 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 00:05:37.823620 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 28 00:05:37.824000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:37.825886 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 28 00:05:37.829744 kernel: audit: type=1130 audit(1769558737.824:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:37.829590 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 28 00:05:37.845782 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 28 00:05:37.847815 kernel: BTRFS info (device vda6): last unmount of filesystem 7e41befc-1b7e-4b8d-988a-42f0dc79ed16 Jan 28 00:05:37.863839 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 28 00:05:37.869054 kernel: audit: type=1130 audit(1769558737.864:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:37.864000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:37.873164 ignition[1066]: INFO : Ignition 2.24.0 Jan 28 00:05:37.873164 ignition[1066]: INFO : Stage: mount Jan 28 00:05:37.875722 ignition[1066]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 28 00:05:37.875722 ignition[1066]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 28 00:05:37.875722 ignition[1066]: INFO : mount: mount passed Jan 28 00:05:37.875722 ignition[1066]: INFO : Ignition finished successfully Jan 28 00:05:37.882091 kernel: audit: type=1130 audit(1769558737.877:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:37.877000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:37.876425 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 28 00:05:38.750275 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 00:05:40.759244 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 00:05:44.764251 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 00:05:44.773858 coreos-metadata[967]: Jan 28 00:05:44.773 WARN failed to locate config-drive, using the metadata service API instead Jan 28 00:05:44.792582 coreos-metadata[967]: Jan 28 00:05:44.792 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 28 00:05:45.417697 coreos-metadata[967]: Jan 28 00:05:45.417 INFO Fetch successful Jan 28 00:05:45.418946 coreos-metadata[967]: Jan 28 00:05:45.418 INFO wrote hostname ci-4593-0-0-n-ea467cc685 to /sysroot/etc/hostname Jan 28 00:05:45.420745 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 28 00:05:45.420835 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. 
Jan 28 00:05:45.424000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:45.426438 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 28 00:05:45.433193 kernel: audit: type=1130 audit(1769558745.424:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:45.433229 kernel: audit: type=1131 audit(1769558745.424:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:45.424000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:45.444384 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 28 00:05:45.475232 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1083) Jan 28 00:05:45.475268 kernel: BTRFS info (device vda6): first mount of filesystem 7e41befc-1b7e-4b8d-988a-42f0dc79ed16 Jan 28 00:05:45.477451 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jan 28 00:05:45.482745 kernel: BTRFS info (device vda6): turning on async discard Jan 28 00:05:45.482765 kernel: BTRFS info (device vda6): enabling free space tree Jan 28 00:05:45.484076 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 28 00:05:45.511699 ignition[1101]: INFO : Ignition 2.24.0 Jan 28 00:05:45.511699 ignition[1101]: INFO : Stage: files Jan 28 00:05:45.513500 ignition[1101]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 28 00:05:45.513500 ignition[1101]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 28 00:05:45.513500 ignition[1101]: DEBUG : files: compiled without relabeling support, skipping Jan 28 00:05:45.517218 ignition[1101]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 28 00:05:45.517218 ignition[1101]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 28 00:05:45.520176 ignition[1101]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 28 00:05:45.520176 ignition[1101]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 28 00:05:45.522876 ignition[1101]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 28 00:05:45.520384 unknown[1101]: wrote ssh authorized keys file for user: core Jan 28 00:05:45.525789 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Jan 28 00:05:45.525789 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Jan 28 00:05:45.582416 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 28 00:05:45.710014 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Jan 28 00:05:45.710014 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file 
"/sysroot/home/core/install.sh" Jan 28 00:05:45.713779 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 28 00:05:45.713779 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 28 00:05:45.713779 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 28 00:05:45.713779 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 28 00:05:45.713779 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 28 00:05:45.713779 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 28 00:05:45.713779 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 28 00:05:45.713779 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 28 00:05:45.713779 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 28 00:05:45.713779 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 28 00:05:45.729585 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 28 00:05:45.729585 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 28 00:05:45.729585 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1 Jan 28 00:05:45.821830 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 28 00:05:46.392172 ignition[1101]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 28 00:05:46.392172 ignition[1101]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 28 00:05:46.395439 ignition[1101]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 28 00:05:46.398888 ignition[1101]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 28 00:05:46.398888 ignition[1101]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 28 00:05:46.398888 ignition[1101]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 28 00:05:46.406284 kernel: audit: type=1130 audit(1769558746.402:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:05:46.402000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.406347 ignition[1101]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 28 00:05:46.406347 ignition[1101]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 28 00:05:46.406347 ignition[1101]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 28 00:05:46.406347 ignition[1101]: INFO : files: files passed Jan 28 00:05:46.406347 ignition[1101]: INFO : Ignition finished successfully Jan 28 00:05:46.401034 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 28 00:05:46.403999 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 28 00:05:46.407813 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 28 00:05:46.416882 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 28 00:05:46.417000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.416976 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 28 00:05:46.424510 kernel: audit: type=1130 audit(1769558746.417:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.424534 kernel: audit: type=1131 audit(1769558746.417:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.417000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.426612 initrd-setup-root-after-ignition[1134]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 28 00:05:46.428121 initrd-setup-root-after-ignition[1138]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 28 00:05:46.429529 initrd-setup-root-after-ignition[1134]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 28 00:05:46.430000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.428846 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 28 00:05:46.436365 kernel: audit: type=1130 audit(1769558746.430:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.430744 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 28 00:05:46.436145 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 28 00:05:46.495405 systemd[1]: initrd-parse-etc.service: Deactivated successfully. 
Jan 28 00:05:46.495534 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 28 00:05:46.503708 kernel: audit: type=1130 audit(1769558746.497:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.503733 kernel: audit: type=1131 audit(1769558746.497:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.497000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.497000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.497972 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 28 00:05:46.504576 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 28 00:05:46.506302 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 28 00:05:46.507188 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 28 00:05:46.533630 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 28 00:05:46.534000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.535939 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 28 00:05:46.540133 kernel: audit: type=1130 audit(1769558746.534:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.553294 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 28 00:05:46.553489 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 28 00:05:46.555842 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 28 00:05:46.557688 systemd[1]: Stopped target timers.target - Timer Units. Jan 28 00:05:46.559285 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 28 00:05:46.560000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.559403 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 28 00:05:46.565107 kernel: audit: type=1131 audit(1769558746.560:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.564143 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 28 00:05:46.566312 systemd[1]: Stopped target basic.target - Basic System. Jan 28 00:05:46.567913 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 28 00:05:46.569489 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. 
Jan 28 00:05:46.571302 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 28 00:05:46.573185 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 28 00:05:46.575039 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 28 00:05:46.576753 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 28 00:05:46.578500 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 28 00:05:46.580178 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 28 00:05:46.581826 systemd[1]: Stopped target swap.target - Swaps. Jan 28 00:05:46.583187 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 28 00:05:46.584000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.583328 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 28 00:05:46.585458 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 28 00:05:46.587185 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 28 00:05:46.588992 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 28 00:05:46.592269 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 28 00:05:46.593375 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 28 00:05:46.595000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.593495 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 28 00:05:46.596018 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 28 00:05:46.597000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.596133 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 28 00:05:46.599000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.598164 systemd[1]: ignition-files.service: Deactivated successfully. Jan 28 00:05:46.598285 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 28 00:05:46.603000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.600843 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 28 00:05:46.601694 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 28 00:05:46.601830 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 28 00:05:46.607000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.604322 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... 
Jan 28 00:05:46.609000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.605809 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 28 00:05:46.611000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.605936 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 28 00:05:46.607727 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 28 00:05:46.607833 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 28 00:05:46.609573 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 28 00:05:46.616000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.616000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.609680 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 28 00:05:46.615034 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 28 00:05:46.615142 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 28 00:05:46.625675 ignition[1158]: INFO : Ignition 2.24.0 Jan 28 00:05:46.625675 ignition[1158]: INFO : Stage: umount Jan 28 00:05:46.627274 ignition[1158]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 28 00:05:46.627274 ignition[1158]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 28 00:05:46.627274 ignition[1158]: INFO : umount: umount passed Jan 28 00:05:46.627274 ignition[1158]: INFO : Ignition finished successfully Jan 28 00:05:46.630000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.628831 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 28 00:05:46.633000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.629437 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 28 00:05:46.635000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.631194 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 28 00:05:46.636000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.631670 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 28 00:05:46.631705 systemd[1]: Stopped ignition-disks.service - Ignition (disks). 
Jan 28 00:05:46.640000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.633342 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 28 00:05:46.633386 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 28 00:05:46.635383 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 28 00:05:46.635430 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 28 00:05:46.636949 systemd[1]: Stopped target network.target - Network. Jan 28 00:05:46.639107 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 28 00:05:46.639162 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 28 00:05:46.641005 systemd[1]: Stopped target paths.target - Path Units. Jan 28 00:05:46.642775 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 28 00:05:46.646539 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 28 00:05:46.648170 systemd[1]: Stopped target slices.target - Slice Units. Jan 28 00:05:46.650149 systemd[1]: Stopped target sockets.target - Socket Units. Jan 28 00:05:46.659000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.651940 systemd[1]: iscsid.socket: Deactivated successfully. Jan 28 00:05:46.660000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.651981 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 28 00:05:46.653356 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 28 00:05:46.653386 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 28 00:05:46.655213 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 28 00:05:46.655235 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 28 00:05:46.657521 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 28 00:05:46.657578 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 28 00:05:46.659362 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 28 00:05:46.672000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.659408 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 28 00:05:46.661015 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 28 00:05:46.662443 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 28 00:05:46.671343 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 28 00:05:46.678000 audit: BPF prog-id=6 op=UNLOAD Jan 28 00:05:46.671470 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 28 00:05:46.682981 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 28 00:05:46.684000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:05:46.683095 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 28 00:05:46.687620 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 28 00:05:46.688000 audit: BPF prog-id=9 op=UNLOAD Jan 28 00:05:46.688780 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 28 00:05:46.688824 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 28 00:05:46.691289 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 28 00:05:46.694000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.692707 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 28 00:05:46.696000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.692774 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 28 00:05:46.698000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.694690 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 28 00:05:46.694735 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 28 00:05:46.697104 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 28 00:05:46.697148 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 28 00:05:46.698995 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 28 00:05:46.714156 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 28 00:05:46.714275 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 28 00:05:46.715000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.717471 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 28 00:05:46.717559 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 28 00:05:46.719000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.719750 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 28 00:05:46.720437 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 28 00:05:46.722000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.722505 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 28 00:05:46.722541 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 28 00:05:46.724224 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 28 00:05:46.727000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:05:46.724255 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 28 00:05:46.725959 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 28 00:05:46.729000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.726018 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 28 00:05:46.728466 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 28 00:05:46.732000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.728508 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 28 00:05:46.730969 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 28 00:05:46.731024 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 28 00:05:46.737000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.734319 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 28 00:05:46.738000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.735298 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 28 00:05:46.741000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.735354 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 28 00:05:46.737218 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 28 00:05:46.737264 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 28 00:05:46.739272 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 28 00:05:46.739317 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 00:05:46.741672 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 28 00:05:46.752477 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 28 00:05:46.753000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.758703 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 28 00:05:46.758823 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 28 00:05:46.762000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:46.762000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:05:46.762504 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 28 00:05:46.764981 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 28 00:05:46.792058 systemd[1]: Switching root. Jan 28 00:05:46.843016 systemd-journald[419]: Journal stopped Jan 28 00:05:47.809035 systemd-journald[419]: Received SIGTERM from PID 1 (systemd). Jan 28 00:05:47.809128 kernel: SELinux: policy capability network_peer_controls=1 Jan 28 00:05:47.809148 kernel: SELinux: policy capability open_perms=1 Jan 28 00:05:47.809161 kernel: SELinux: policy capability extended_socket_class=1 Jan 28 00:05:47.809178 kernel: SELinux: policy capability always_check_network=0 Jan 28 00:05:47.809188 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 28 00:05:47.809201 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 28 00:05:47.809247 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 28 00:05:47.809263 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 28 00:05:47.809275 kernel: SELinux: policy capability userspace_initial_context=0 Jan 28 00:05:47.809288 systemd[1]: Successfully loaded SELinux policy in 64.534ms. Jan 28 00:05:47.809308 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.574ms. Jan 28 00:05:47.809320 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 28 00:05:47.809332 systemd[1]: Detected virtualization kvm. Jan 28 00:05:47.809346 systemd[1]: Detected architecture arm64. Jan 28 00:05:47.809356 systemd[1]: Detected first boot. Jan 28 00:05:47.809366 systemd[1]: Hostname set to . Jan 28 00:05:47.809379 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 28 00:05:47.809393 zram_generator::config[1205]: No configuration found. Jan 28 00:05:47.809407 kernel: NET: Registered PF_VSOCK protocol family Jan 28 00:05:47.809418 systemd[1]: Populated /etc with preset unit settings. Jan 28 00:05:47.809429 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 28 00:05:47.809439 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 28 00:05:47.809450 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 28 00:05:47.809463 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 28 00:05:47.809475 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 28 00:05:47.809487 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 28 00:05:47.809497 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 28 00:05:47.809508 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 28 00:05:47.809519 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 28 00:05:47.809529 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 28 00:05:47.809542 systemd[1]: Created slice user.slice - User and Session Slice. Jan 28 00:05:47.809552 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Jan 28 00:05:47.809564 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 28 00:05:47.809575 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 28 00:05:47.809586 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 28 00:05:47.809602 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 28 00:05:47.809616 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 28 00:05:47.809628 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Jan 28 00:05:47.809638 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 28 00:05:47.809649 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 28 00:05:47.809660 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 28 00:05:47.809670 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 28 00:05:47.809682 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 28 00:05:47.809693 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 28 00:05:47.809704 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 28 00:05:47.809714 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 28 00:05:47.809725 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 28 00:05:47.809736 systemd[1]: Reached target slices.target - Slice Units. Jan 28 00:05:47.809747 systemd[1]: Reached target swap.target - Swaps. Jan 28 00:05:47.809759 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 28 00:05:47.809769 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 28 00:05:47.809780 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 28 00:05:47.809790 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 28 00:05:47.809801 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 28 00:05:47.809812 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 28 00:05:47.809822 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 28 00:05:47.809834 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 28 00:05:47.809845 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 28 00:05:47.809855 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 28 00:05:47.809866 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 28 00:05:47.809876 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 28 00:05:47.809887 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 28 00:05:47.809898 systemd[1]: Mounting media.mount - External Media Directory... Jan 28 00:05:47.809910 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 28 00:05:47.809924 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 28 00:05:47.809935 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... 
Jan 28 00:05:47.809946 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 28 00:05:47.809957 systemd[1]: Reached target machines.target - Containers. Jan 28 00:05:47.809967 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 28 00:05:47.809980 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 28 00:05:47.809992 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 28 00:05:47.810014 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 28 00:05:47.810027 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 28 00:05:47.810158 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 28 00:05:47.810174 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 28 00:05:47.810186 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 28 00:05:47.810196 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 28 00:05:47.810220 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 28 00:05:47.810234 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 28 00:05:47.810249 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 28 00:05:47.810264 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 28 00:05:47.810275 systemd[1]: Stopped systemd-fsck-usr.service. Jan 28 00:05:47.810287 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 28 00:05:47.810298 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 28 00:05:47.810308 kernel: fuse: init (API version 7.41) Jan 28 00:05:47.810319 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 28 00:05:47.810331 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 28 00:05:47.810343 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 28 00:05:47.810356 kernel: ACPI: bus type drm_connector registered Jan 28 00:05:47.810366 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 28 00:05:47.810377 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 28 00:05:47.810388 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 28 00:05:47.810398 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 28 00:05:47.810409 systemd[1]: Mounted media.mount - External Media Directory. Jan 28 00:05:47.810421 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 28 00:05:47.810432 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 28 00:05:47.810443 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 28 00:05:47.810453 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 28 00:05:47.810487 systemd-journald[1269]: Collecting audit messages is enabled. 
Jan 28 00:05:47.812888 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 28 00:05:47.812916 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 28 00:05:47.812928 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 28 00:05:47.812940 systemd-journald[1269]: Journal started Jan 28 00:05:47.812968 systemd-journald[1269]: Runtime Journal (/run/log/journal/14cdf137335d468d8f1790d27f444676) is 8M, max 319.5M, 311.5M free. Jan 28 00:05:47.660000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Jan 28 00:05:47.751000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.753000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.756000 audit: BPF prog-id=14 op=UNLOAD Jan 28 00:05:47.756000 audit: BPF prog-id=13 op=UNLOAD Jan 28 00:05:47.757000 audit: BPF prog-id=15 op=LOAD Jan 28 00:05:47.757000 audit: BPF prog-id=16 op=LOAD Jan 28 00:05:47.757000 audit: BPF prog-id=17 op=LOAD Jan 28 00:05:47.806000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 28 00:05:47.806000 audit[1269]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=6 a1=ffffc45ee1c0 a2=4000 a3=0 items=0 ppid=1 pid=1269 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:05:47.806000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 28 00:05:47.808000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.811000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.811000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.566293 systemd[1]: Queued start job for default target multi-user.target. Jan 28 00:05:47.589509 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 28 00:05:47.589901 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 28 00:05:47.814036 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 28 00:05:47.814000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:05:47.815000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.817091 systemd[1]: Started systemd-journald.service - Journal Service. Jan 28 00:05:47.817000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.818095 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 28 00:05:47.818265 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 28 00:05:47.819000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.819000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.819565 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 28 00:05:47.819706 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 28 00:05:47.822000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.822000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.822515 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 28 00:05:47.822668 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 28 00:05:47.823000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.823000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.823880 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 28 00:05:47.824015 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 28 00:05:47.824000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.824000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.825436 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 28 00:05:47.826000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:05:47.826798 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 28 00:05:47.827000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.828323 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 28 00:05:47.829000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.830250 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 28 00:05:47.831000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.831886 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 28 00:05:47.832000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.844170 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 28 00:05:47.846169 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 28 00:05:47.848427 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 28 00:05:47.850249 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 28 00:05:47.851248 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 28 00:05:47.851283 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 28 00:05:47.853058 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 28 00:05:47.854827 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 28 00:05:47.854938 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 28 00:05:47.860364 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 28 00:05:47.862290 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 28 00:05:47.863320 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 28 00:05:47.864292 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 28 00:05:47.865457 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 28 00:05:47.866474 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 28 00:05:47.870662 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 28 00:05:47.875497 systemd[1]: Starting systemd-sysusers.service - Create System Users... 
Jan 28 00:05:47.877836 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 28 00:05:47.879245 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 28 00:05:47.883521 systemd-journald[1269]: Time spent on flushing to /var/log/journal/14cdf137335d468d8f1790d27f444676 is 25.338ms for 1816 entries. Jan 28 00:05:47.883521 systemd-journald[1269]: System Journal (/var/log/journal/14cdf137335d468d8f1790d27f444676) is 8M, max 588.1M, 580.1M free. Jan 28 00:05:47.943849 systemd-journald[1269]: Received client request to flush runtime journal. Jan 28 00:05:47.943913 kernel: loop1: detected capacity change from 0 to 211168 Jan 28 00:05:47.890000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.900000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.907000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.887764 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 28 00:05:47.890649 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 28 00:05:47.893626 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 28 00:05:47.899459 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 28 00:05:47.906614 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 28 00:05:47.943960 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 28 00:05:47.945000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.946229 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 28 00:05:47.947000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.949000 audit: BPF prog-id=18 op=LOAD Jan 28 00:05:47.949000 audit: BPF prog-id=19 op=LOAD Jan 28 00:05:47.950000 audit: BPF prog-id=20 op=LOAD Jan 28 00:05:47.950921 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 28 00:05:47.953000 audit: BPF prog-id=21 op=LOAD Jan 28 00:05:47.954083 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 28 00:05:47.957342 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 28 00:05:47.959000 audit: BPF prog-id=22 op=LOAD Jan 28 00:05:47.959000 audit: BPF prog-id=23 op=LOAD Jan 28 00:05:47.959000 audit: BPF prog-id=24 op=LOAD Jan 28 00:05:47.960684 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... 
Jan 28 00:05:47.963226 kernel: loop2: detected capacity change from 0 to 45344 Jan 28 00:05:47.963000 audit: BPF prog-id=25 op=LOAD Jan 28 00:05:47.963000 audit: BPF prog-id=26 op=LOAD Jan 28 00:05:47.963000 audit: BPF prog-id=27 op=LOAD Jan 28 00:05:47.964797 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 28 00:05:47.973304 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 28 00:05:47.974000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:47.999115 systemd-tmpfiles[1344]: ACLs are not supported, ignoring. Jan 28 00:05:47.999135 systemd-tmpfiles[1344]: ACLs are not supported, ignoring. Jan 28 00:05:48.000178 systemd-nsresourced[1345]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 28 00:05:48.001276 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 28 00:05:48.002000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.006291 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 28 00:05:48.007000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.008901 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 28 00:05:48.011000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.024242 kernel: loop3: detected capacity change from 0 to 100192 Jan 28 00:05:48.052585 systemd-oomd[1342]: No swap; memory pressure usage will be degraded Jan 28 00:05:48.053371 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 28 00:05:48.054000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.069055 systemd-resolved[1343]: Positive Trust Anchors: Jan 28 00:05:48.069074 systemd-resolved[1343]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 28 00:05:48.069077 systemd-resolved[1343]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 28 00:05:48.069108 systemd-resolved[1343]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 28 00:05:48.083043 systemd-resolved[1343]: Using system hostname 'ci-4593-0-0-n-ea467cc685'. 
Jan 28 00:05:48.084227 kernel: loop4: detected capacity change from 0 to 1648 Jan 28 00:05:48.084354 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 28 00:05:48.085000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.086039 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 28 00:05:48.115248 kernel: loop5: detected capacity change from 0 to 211168 Jan 28 00:05:48.135235 kernel: loop6: detected capacity change from 0 to 45344 Jan 28 00:05:48.156226 kernel: loop7: detected capacity change from 0 to 100192 Jan 28 00:05:48.177228 kernel: loop1: detected capacity change from 0 to 1648 Jan 28 00:05:48.184819 (sd-merge)[1369]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. Jan 28 00:05:48.187961 (sd-merge)[1369]: Merged extensions into '/usr'. Jan 28 00:05:48.191935 systemd[1]: Reload requested from client PID 1326 ('systemd-sysext') (unit systemd-sysext.service)... Jan 28 00:05:48.191961 systemd[1]: Reloading... Jan 28 00:05:48.246229 zram_generator::config[1399]: No configuration found. Jan 28 00:05:48.395619 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 28 00:05:48.395817 systemd[1]: Reloading finished in 203 ms. Jan 28 00:05:48.426277 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 28 00:05:48.427000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.427657 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 28 00:05:48.428000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.451883 systemd[1]: Starting ensure-sysext.service... Jan 28 00:05:48.453660 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 28 00:05:48.454000 audit: BPF prog-id=8 op=UNLOAD Jan 28 00:05:48.454000 audit: BPF prog-id=7 op=UNLOAD Jan 28 00:05:48.455000 audit: BPF prog-id=28 op=LOAD Jan 28 00:05:48.455000 audit: BPF prog-id=29 op=LOAD Jan 28 00:05:48.456013 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 28 00:05:48.457000 audit: BPF prog-id=30 op=LOAD Jan 28 00:05:48.457000 audit: BPF prog-id=21 op=UNLOAD Jan 28 00:05:48.459000 audit: BPF prog-id=31 op=LOAD Jan 28 00:05:48.459000 audit: BPF prog-id=25 op=UNLOAD Jan 28 00:05:48.459000 audit: BPF prog-id=32 op=LOAD Jan 28 00:05:48.459000 audit: BPF prog-id=33 op=LOAD Jan 28 00:05:48.459000 audit: BPF prog-id=26 op=UNLOAD Jan 28 00:05:48.459000 audit: BPF prog-id=27 op=UNLOAD Jan 28 00:05:48.459000 audit: BPF prog-id=34 op=LOAD Jan 28 00:05:48.459000 audit: BPF prog-id=18 op=UNLOAD Jan 28 00:05:48.459000 audit: BPF prog-id=35 op=LOAD Jan 28 00:05:48.460000 audit: BPF prog-id=36 op=LOAD Jan 28 00:05:48.460000 audit: BPF prog-id=19 op=UNLOAD Jan 28 00:05:48.460000 audit: BPF prog-id=20 op=UNLOAD Jan 28 00:05:48.460000 audit: BPF prog-id=37 op=LOAD Jan 28 00:05:48.460000 audit: BPF prog-id=22 op=UNLOAD Jan 28 00:05:48.460000 audit: BPF prog-id=38 op=LOAD Jan 28 00:05:48.460000 audit: BPF prog-id=39 op=LOAD Jan 28 00:05:48.460000 audit: BPF prog-id=23 op=UNLOAD Jan 28 00:05:48.460000 audit: BPF prog-id=24 op=UNLOAD Jan 28 00:05:48.461000 audit: BPF prog-id=40 op=LOAD Jan 28 00:05:48.461000 audit: BPF prog-id=15 op=UNLOAD Jan 28 00:05:48.461000 audit: BPF prog-id=41 op=LOAD Jan 28 00:05:48.461000 audit: BPF prog-id=42 op=LOAD Jan 28 00:05:48.461000 audit: BPF prog-id=16 op=UNLOAD Jan 28 00:05:48.461000 audit: BPF prog-id=17 op=UNLOAD Jan 28 00:05:48.465340 systemd[1]: Reload requested from client PID 1438 ('systemctl') (unit ensure-sysext.service)... Jan 28 00:05:48.465357 systemd[1]: Reloading... Jan 28 00:05:48.467702 systemd-tmpfiles[1439]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 28 00:05:48.467742 systemd-tmpfiles[1439]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 28 00:05:48.467957 systemd-tmpfiles[1439]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 28 00:05:48.468859 systemd-tmpfiles[1439]: ACLs are not supported, ignoring. Jan 28 00:05:48.468912 systemd-tmpfiles[1439]: ACLs are not supported, ignoring. Jan 28 00:05:48.477517 systemd-tmpfiles[1439]: Detected autofs mount point /boot during canonicalization of boot. Jan 28 00:05:48.477533 systemd-tmpfiles[1439]: Skipping /boot Jan 28 00:05:48.478981 systemd-udevd[1440]: Using default interface naming scheme 'v257'. Jan 28 00:05:48.483512 systemd-tmpfiles[1439]: Detected autofs mount point /boot during canonicalization of boot. Jan 28 00:05:48.483528 systemd-tmpfiles[1439]: Skipping /boot Jan 28 00:05:48.514235 zram_generator::config[1470]: No configuration found. Jan 28 00:05:48.616242 kernel: mousedev: PS/2 mouse device common for all mice Jan 28 00:05:48.670465 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Jan 28 00:05:48.670541 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 28 00:05:48.670559 kernel: [drm] features: -context_init Jan 28 00:05:48.672535 kernel: [drm] number of scanouts: 1 Jan 28 00:05:48.672585 kernel: [drm] number of cap sets: 0 Jan 28 00:05:48.675235 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Jan 28 00:05:48.689556 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. 
Jan 28 00:05:48.690234 kernel: Console: switching to colour frame buffer device 160x50 Jan 28 00:05:48.691348 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 28 00:05:48.706487 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Jan 28 00:05:48.706735 systemd[1]: Reloading finished in 241 ms. Jan 28 00:05:48.718994 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 28 00:05:48.720000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.723000 audit: BPF prog-id=43 op=LOAD Jan 28 00:05:48.723000 audit: BPF prog-id=34 op=UNLOAD Jan 28 00:05:48.723000 audit: BPF prog-id=44 op=LOAD Jan 28 00:05:48.723000 audit: BPF prog-id=45 op=LOAD Jan 28 00:05:48.723000 audit: BPF prog-id=35 op=UNLOAD Jan 28 00:05:48.723000 audit: BPF prog-id=36 op=UNLOAD Jan 28 00:05:48.724000 audit: BPF prog-id=46 op=LOAD Jan 28 00:05:48.724000 audit: BPF prog-id=31 op=UNLOAD Jan 28 00:05:48.724000 audit: BPF prog-id=47 op=LOAD Jan 28 00:05:48.724000 audit: BPF prog-id=48 op=LOAD Jan 28 00:05:48.724000 audit: BPF prog-id=32 op=UNLOAD Jan 28 00:05:48.724000 audit: BPF prog-id=33 op=UNLOAD Jan 28 00:05:48.724000 audit: BPF prog-id=49 op=LOAD Jan 28 00:05:48.724000 audit: BPF prog-id=50 op=LOAD Jan 28 00:05:48.724000 audit: BPF prog-id=28 op=UNLOAD Jan 28 00:05:48.725000 audit: BPF prog-id=29 op=UNLOAD Jan 28 00:05:48.725000 audit: BPF prog-id=51 op=LOAD Jan 28 00:05:48.725000 audit: BPF prog-id=40 op=UNLOAD Jan 28 00:05:48.725000 audit: BPF prog-id=52 op=LOAD Jan 28 00:05:48.725000 audit: BPF prog-id=53 op=LOAD Jan 28 00:05:48.725000 audit: BPF prog-id=41 op=UNLOAD Jan 28 00:05:48.725000 audit: BPF prog-id=42 op=UNLOAD Jan 28 00:05:48.731000 audit: BPF prog-id=54 op=LOAD Jan 28 00:05:48.731000 audit: BPF prog-id=37 op=UNLOAD Jan 28 00:05:48.731000 audit: BPF prog-id=55 op=LOAD Jan 28 00:05:48.731000 audit: BPF prog-id=56 op=LOAD Jan 28 00:05:48.731000 audit: BPF prog-id=38 op=UNLOAD Jan 28 00:05:48.731000 audit: BPF prog-id=39 op=UNLOAD Jan 28 00:05:48.732000 audit: BPF prog-id=57 op=LOAD Jan 28 00:05:48.732000 audit: BPF prog-id=30 op=UNLOAD Jan 28 00:05:48.738325 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 28 00:05:48.739000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.759592 systemd[1]: Finished ensure-sysext.service. Jan 28 00:05:48.760000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.783165 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 28 00:05:48.785711 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 28 00:05:48.786932 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 28 00:05:48.803304 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 28 00:05:48.805127 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... 
Jan 28 00:05:48.807486 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 28 00:05:48.810645 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 28 00:05:48.812607 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Jan 28 00:05:48.813804 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 28 00:05:48.813927 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 28 00:05:48.815537 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 28 00:05:48.817606 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 28 00:05:48.818831 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 28 00:05:48.821238 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 28 00:05:48.823000 audit: BPF prog-id=58 op=LOAD Jan 28 00:05:48.824913 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 28 00:05:48.826404 systemd[1]: Reached target time-set.target - System Time Set. Jan 28 00:05:48.828387 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 28 00:05:48.831283 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 28 00:05:48.832452 kernel: pps_core: LinuxPPS API ver. 1 registered Jan 28 00:05:48.832517 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Jan 28 00:05:48.834469 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 28 00:05:48.836000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.836000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.835360 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 28 00:05:48.836772 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 28 00:05:48.839247 kernel: PTP clock support registered Jan 28 00:05:48.839352 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 28 00:05:48.840000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.840000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.840731 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 28 00:05:48.840900 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
Jan 28 00:05:48.842000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.842000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.842782 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 28 00:05:48.843000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.843118 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 28 00:05:48.847000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.847531 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Jan 28 00:05:48.847705 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Jan 28 00:05:48.848000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.848000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.849289 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 28 00:05:48.850000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.857625 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 28 00:05:48.857700 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 28 00:05:48.867000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.867000 audit[1577]: SYSTEM_BOOT pid=1577 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 28 00:05:48.866315 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 28 00:05:48.873588 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 28 00:05:48.874000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:05:48.886000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 28 00:05:48.886000 audit[1606]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffe78df1b0 a2=420 a3=0 items=0 ppid=1560 pid=1606 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:05:48.886000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 28 00:05:48.887692 augenrules[1606]: No rules Jan 28 00:05:48.889472 systemd[1]: audit-rules.service: Deactivated successfully. Jan 28 00:05:48.889718 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 28 00:05:48.916375 systemd-networkd[1576]: lo: Link UP Jan 28 00:05:48.916384 systemd-networkd[1576]: lo: Gained carrier Jan 28 00:05:48.917441 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 28 00:05:48.917847 systemd-networkd[1576]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 28 00:05:48.917851 systemd-networkd[1576]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 28 00:05:48.918529 systemd-networkd[1576]: eth0: Link UP Jan 28 00:05:48.918653 systemd-networkd[1576]: eth0: Gained carrier Jan 28 00:05:48.918667 systemd-networkd[1576]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 28 00:05:48.918947 systemd[1]: Reached target network.target - Network. Jan 28 00:05:48.922368 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 28 00:05:48.924838 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 28 00:05:48.937267 systemd-networkd[1576]: eth0: DHCPv4 address 10.0.1.105/25, gateway 10.0.1.1 acquired from 10.0.1.1 Jan 28 00:05:48.949268 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 28 00:05:48.983787 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 28 00:05:49.006660 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 28 00:05:49.010550 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 28 00:05:49.424872 ldconfig[1568]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 28 00:05:49.430377 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 28 00:05:49.432786 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 28 00:05:49.457036 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 28 00:05:49.458384 systemd[1]: Reached target sysinit.target - System Initialization. Jan 28 00:05:49.459420 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 28 00:05:49.460518 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. 
Jan 28 00:05:49.461805 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 28 00:05:49.462948 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 28 00:05:49.464129 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 28 00:05:49.465519 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 28 00:05:49.466515 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 28 00:05:49.467619 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 28 00:05:49.467652 systemd[1]: Reached target paths.target - Path Units. Jan 28 00:05:49.468507 systemd[1]: Reached target timers.target - Timer Units. Jan 28 00:05:49.470334 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 28 00:05:49.472431 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 28 00:05:49.474966 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 28 00:05:49.476405 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 28 00:05:49.477509 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 28 00:05:49.495270 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 28 00:05:49.496438 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 28 00:05:49.498027 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 28 00:05:49.499125 systemd[1]: Reached target sockets.target - Socket Units. Jan 28 00:05:49.500043 systemd[1]: Reached target basic.target - Basic System. Jan 28 00:05:49.500962 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 28 00:05:49.500994 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 28 00:05:49.503805 systemd[1]: Starting chronyd.service - NTP client/server... Jan 28 00:05:49.505457 systemd[1]: Starting containerd.service - containerd container runtime... Jan 28 00:05:49.507519 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 28 00:05:49.510360 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 28 00:05:49.512189 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 28 00:05:49.515109 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 28 00:05:49.515234 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 00:05:49.517071 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 28 00:05:49.518615 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 28 00:05:49.521370 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 28 00:05:49.523964 jq[1631]: false Jan 28 00:05:49.525138 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 28 00:05:49.527095 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 28 00:05:49.531378 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
Jan 28 00:05:49.536770 extend-filesystems[1632]: Found /dev/vda6 Jan 28 00:05:49.540757 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 28 00:05:49.543005 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 28 00:05:49.543407 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 28 00:05:49.543930 systemd[1]: Starting update-engine.service - Update Engine... Jan 28 00:05:49.547141 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 28 00:05:49.548328 extend-filesystems[1632]: Found /dev/vda9 Jan 28 00:05:49.551855 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 28 00:05:49.553665 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 28 00:05:49.553878 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 28 00:05:49.554606 extend-filesystems[1632]: Checking size of /dev/vda9 Jan 28 00:05:49.555147 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 28 00:05:49.555376 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 28 00:05:49.562136 systemd[1]: motdgen.service: Deactivated successfully. Jan 28 00:05:49.562349 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 28 00:05:49.566258 jq[1644]: true Jan 28 00:05:49.576299 chronyd[1624]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Jan 28 00:05:49.577697 chronyd[1624]: Loaded seccomp filter (level 2) Jan 28 00:05:49.578458 systemd[1]: Started chronyd.service - NTP client/server. Jan 28 00:05:49.582046 jq[1667]: true Jan 28 00:05:49.586201 extend-filesystems[1632]: Resized partition /dev/vda9 Jan 28 00:05:49.595220 extend-filesystems[1679]: resize2fs 1.47.3 (8-Jul-2025) Jan 28 00:05:49.606229 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Jan 28 00:05:49.606615 update_engine[1643]: I20260128 00:05:49.606379 1643 main.cc:92] Flatcar Update Engine starting Jan 28 00:05:49.608240 tar[1651]: linux-arm64/LICENSE Jan 28 00:05:49.608445 tar[1651]: linux-arm64/helm Jan 28 00:05:49.630424 dbus-daemon[1627]: [system] SELinux support is enabled Jan 28 00:05:49.630645 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 28 00:05:49.634292 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 28 00:05:49.634326 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 28 00:05:49.635223 update_engine[1643]: I20260128 00:05:49.635154 1643 update_check_scheduler.cc:74] Next update check in 8m51s Jan 28 00:05:49.636317 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 28 00:05:49.636341 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 28 00:05:49.640643 systemd[1]: Started update-engine.service - Update Engine. Jan 28 00:05:49.645219 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Jan 28 00:05:49.653399 systemd-logind[1641]: New seat seat0. Jan 28 00:05:49.654494 systemd-logind[1641]: Watching system buttons on /dev/input/event0 (Power Button) Jan 28 00:05:49.654524 systemd-logind[1641]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jan 28 00:05:49.654746 systemd[1]: Started systemd-logind.service - User Login Management. Jan 28 00:05:49.696499 locksmithd[1695]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 28 00:05:49.733547 containerd[1660]: time="2026-01-28T00:05:49Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 28 00:05:49.734399 containerd[1660]: time="2026-01-28T00:05:49.734367880Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 28 00:05:49.744924 containerd[1660]: time="2026-01-28T00:05:49.744881080Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.76µs" Jan 28 00:05:49.744924 containerd[1660]: time="2026-01-28T00:05:49.744921360Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 28 00:05:49.752815 containerd[1660]: time="2026-01-28T00:05:49.744963800Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 28 00:05:49.752815 containerd[1660]: time="2026-01-28T00:05:49.744980560Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 28 00:05:49.753787 containerd[1660]: time="2026-01-28T00:05:49.753739840Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 28 00:05:49.753787 containerd[1660]: time="2026-01-28T00:05:49.753780400Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 28 00:05:49.753862 containerd[1660]: time="2026-01-28T00:05:49.753842280Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 28 00:05:49.753862 containerd[1660]: time="2026-01-28T00:05:49.753858000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 28 00:05:49.754235 containerd[1660]: time="2026-01-28T00:05:49.754133240Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 28 00:05:49.754235 containerd[1660]: time="2026-01-28T00:05:49.754155280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 28 00:05:49.754235 containerd[1660]: time="2026-01-28T00:05:49.754186920Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 28 00:05:49.754235 containerd[1660]: time="2026-01-28T00:05:49.754194760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 28 00:05:49.754388 containerd[1660]: time="2026-01-28T00:05:49.754345440Z" level=info msg="skip loading plugin" error="EROFS 
unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 28 00:05:49.754388 containerd[1660]: time="2026-01-28T00:05:49.754370520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 28 00:05:49.754456 containerd[1660]: time="2026-01-28T00:05:49.754442120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 28 00:05:49.754623 containerd[1660]: time="2026-01-28T00:05:49.754593120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 28 00:05:49.754654 containerd[1660]: time="2026-01-28T00:05:49.754624280Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 28 00:05:49.754654 containerd[1660]: time="2026-01-28T00:05:49.754634160Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 28 00:05:49.754710 containerd[1660]: time="2026-01-28T00:05:49.754664600Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 28 00:05:49.754950 containerd[1660]: time="2026-01-28T00:05:49.754881840Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 28 00:05:49.754986 containerd[1660]: time="2026-01-28T00:05:49.754959960Z" level=info msg="metadata content store policy set" policy=shared Jan 28 00:05:49.822825 containerd[1660]: time="2026-01-28T00:05:49.822775000Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 28 00:05:49.822905 containerd[1660]: time="2026-01-28T00:05:49.822834960Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 28 00:05:49.822929 bash[1696]: Updated "/home/core/.ssh/authorized_keys" Jan 28 00:05:49.824041 containerd[1660]: time="2026-01-28T00:05:49.822925160Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 28 00:05:49.824041 containerd[1660]: time="2026-01-28T00:05:49.822939840Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 28 00:05:49.824041 containerd[1660]: time="2026-01-28T00:05:49.822952400Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 28 00:05:49.824041 containerd[1660]: time="2026-01-28T00:05:49.822965160Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 28 00:05:49.824041 containerd[1660]: time="2026-01-28T00:05:49.822984880Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 28 00:05:49.824041 containerd[1660]: time="2026-01-28T00:05:49.822995360Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 28 00:05:49.824041 containerd[1660]: time="2026-01-28T00:05:49.823006440Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 28 00:05:49.824041 containerd[1660]: time="2026-01-28T00:05:49.823047880Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 28 00:05:49.824041 containerd[1660]: time="2026-01-28T00:05:49.823060200Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 28 00:05:49.824041 containerd[1660]: time="2026-01-28T00:05:49.823070440Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 28 00:05:49.824041 containerd[1660]: time="2026-01-28T00:05:49.823080680Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 28 00:05:49.824041 containerd[1660]: time="2026-01-28T00:05:49.823092280Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 28 00:05:49.824041 containerd[1660]: time="2026-01-28T00:05:49.823230880Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 28 00:05:49.824683 containerd[1660]: time="2026-01-28T00:05:49.823268000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 28 00:05:49.824683 containerd[1660]: time="2026-01-28T00:05:49.823285000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 28 00:05:49.824683 containerd[1660]: time="2026-01-28T00:05:49.823295320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 28 00:05:49.824683 containerd[1660]: time="2026-01-28T00:05:49.823305880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 28 00:05:49.824683 containerd[1660]: time="2026-01-28T00:05:49.823315880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 28 00:05:49.824683 containerd[1660]: time="2026-01-28T00:05:49.823328040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 28 00:05:49.824683 containerd[1660]: time="2026-01-28T00:05:49.823340200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 28 00:05:49.824683 containerd[1660]: time="2026-01-28T00:05:49.823350760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 28 00:05:49.824683 containerd[1660]: time="2026-01-28T00:05:49.823361560Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 28 00:05:49.824683 containerd[1660]: time="2026-01-28T00:05:49.823371440Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 28 00:05:49.824683 containerd[1660]: time="2026-01-28T00:05:49.823403600Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 28 00:05:49.824683 containerd[1660]: time="2026-01-28T00:05:49.823441520Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 28 00:05:49.824683 containerd[1660]: time="2026-01-28T00:05:49.823454360Z" level=info msg="Start snapshots syncer" Jan 28 00:05:49.824683 containerd[1660]: time="2026-01-28T00:05:49.823473120Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 28 00:05:49.824906 containerd[1660]: time="2026-01-28T00:05:49.823703080Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 28 00:05:49.824906 containerd[1660]: time="2026-01-28T00:05:49.823748960Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 28 00:05:49.824996 containerd[1660]: time="2026-01-28T00:05:49.823789320Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 28 00:05:49.824996 containerd[1660]: time="2026-01-28T00:05:49.823877320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 28 00:05:49.824996 containerd[1660]: time="2026-01-28T00:05:49.823898720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 28 00:05:49.824996 containerd[1660]: time="2026-01-28T00:05:49.823909400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 28 00:05:49.824996 containerd[1660]: time="2026-01-28T00:05:49.823918840Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 28 00:05:49.824996 containerd[1660]: time="2026-01-28T00:05:49.823956040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 28 00:05:49.824996 containerd[1660]: time="2026-01-28T00:05:49.823974880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 28 00:05:49.824996 containerd[1660]: time="2026-01-28T00:05:49.823985720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 28 00:05:49.824996 containerd[1660]: time="2026-01-28T00:05:49.823995680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 28 
00:05:49.824996 containerd[1660]: time="2026-01-28T00:05:49.824006200Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 28 00:05:49.824996 containerd[1660]: time="2026-01-28T00:05:49.824034880Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 28 00:05:49.824996 containerd[1660]: time="2026-01-28T00:05:49.824062320Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 28 00:05:49.824996 containerd[1660]: time="2026-01-28T00:05:49.824071320Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 28 00:05:49.825199 containerd[1660]: time="2026-01-28T00:05:49.824081240Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 28 00:05:49.825199 containerd[1660]: time="2026-01-28T00:05:49.824089600Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 28 00:05:49.825199 containerd[1660]: time="2026-01-28T00:05:49.824099680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 28 00:05:49.825199 containerd[1660]: time="2026-01-28T00:05:49.824109640Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 28 00:05:49.825199 containerd[1660]: time="2026-01-28T00:05:49.824217400Z" level=info msg="runtime interface created" Jan 28 00:05:49.825199 containerd[1660]: time="2026-01-28T00:05:49.824224240Z" level=info msg="created NRI interface" Jan 28 00:05:49.825199 containerd[1660]: time="2026-01-28T00:05:49.824232440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 28 00:05:49.825199 containerd[1660]: time="2026-01-28T00:05:49.824245760Z" level=info msg="Connect containerd service" Jan 28 00:05:49.825199 containerd[1660]: time="2026-01-28T00:05:49.824266360Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 28 00:05:49.825199 containerd[1660]: time="2026-01-28T00:05:49.824851720Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 28 00:05:49.826534 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 28 00:05:49.831041 systemd[1]: Starting sshkeys.service... Jan 28 00:05:49.849138 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 28 00:05:49.852101 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
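The "failed to load cni during init" error above is expected on a node where no CNI network config has been installed yet: the CRI plugin config logged a few entries earlier points containerd at confDir /etc/cni/net.d and binDirs /opt/cni/bin, and its conf syncer will pick up a config once one appears. Purely as an illustration of what would satisfy that check, a minimal bridge conflist is sketched below; the file name, the 10.88.0.0/16 subnet, and the cniVersion are assumptions (the version must match the plugins installed under /opt/cni/bin), and on a Kubernetes node this file is normally written by the cluster's network add-on rather than by hand.

# Illustrative only: a minimal conflist in the directory containerd watches.
mkdir -p /etc/cni/net.d
cat <<'EOF' > /etc/cni/net.d/10-containerd-net.conflist
{
  "cniVersion": "1.0.0",
  "name": "containerd-net",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": {
        "type": "host-local",
        "ranges": [[{ "subnet": "10.88.0.0/16" }]],
        "routes": [{ "dst": "0.0.0.0/0" }]
      }
    },
    { "type": "portmap", "capabilities": { "portMappings": true } }
  ]
}
EOF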
Jan 28 00:05:49.867237 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 00:05:49.922107 containerd[1660]: time="2026-01-28T00:05:49.921456160Z" level=info msg="Start subscribing containerd event" Jan 28 00:05:49.922107 containerd[1660]: time="2026-01-28T00:05:49.921519160Z" level=info msg="Start recovering state" Jan 28 00:05:49.922107 containerd[1660]: time="2026-01-28T00:05:49.921599080Z" level=info msg="Start event monitor" Jan 28 00:05:49.922107 containerd[1660]: time="2026-01-28T00:05:49.921614000Z" level=info msg="Start cni network conf syncer for default" Jan 28 00:05:49.922107 containerd[1660]: time="2026-01-28T00:05:49.921621160Z" level=info msg="Start streaming server" Jan 28 00:05:49.922107 containerd[1660]: time="2026-01-28T00:05:49.921632200Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 28 00:05:49.922107 containerd[1660]: time="2026-01-28T00:05:49.921638800Z" level=info msg="runtime interface starting up..." Jan 28 00:05:49.922107 containerd[1660]: time="2026-01-28T00:05:49.921644040Z" level=info msg="starting plugins..." Jan 28 00:05:49.922107 containerd[1660]: time="2026-01-28T00:05:49.921655960Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 28 00:05:49.922107 containerd[1660]: time="2026-01-28T00:05:49.921868240Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 28 00:05:49.922107 containerd[1660]: time="2026-01-28T00:05:49.921920440Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 28 00:05:49.922107 containerd[1660]: time="2026-01-28T00:05:49.922018560Z" level=info msg="containerd successfully booted in 0.188839s" Jan 28 00:05:49.922252 systemd[1]: Started containerd.service - containerd container runtime. Jan 28 00:05:49.957240 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Jan 28 00:05:49.973410 systemd-networkd[1576]: eth0: Gained IPv6LL Jan 28 00:05:49.980057 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 28 00:05:49.982084 systemd[1]: Reached target network-online.target - Network is Online. Jan 28 00:05:49.984616 extend-filesystems[1679]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 28 00:05:49.984616 extend-filesystems[1679]: old_desc_blocks = 1, new_desc_blocks = 6 Jan 28 00:05:49.984616 extend-filesystems[1679]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Jan 28 00:05:49.990242 extend-filesystems[1632]: Resized filesystem in /dev/vda9 Jan 28 00:05:49.986495 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 00:05:49.991298 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 28 00:05:49.993779 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 28 00:05:50.000248 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 28 00:05:50.038043 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 28 00:05:50.092750 tar[1651]: linux-arm64/README.md Jan 28 00:05:50.112745 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 28 00:05:50.135965 sshd_keygen[1659]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 28 00:05:50.154613 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 28 00:05:50.157413 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 28 00:05:50.180292 systemd[1]: issuegen.service: Deactivated successfully. 
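The extend-filesystems messages above are resize2fs output: the root partition /dev/vda9 had grown, so the mounted ext4 filesystem was resized online to 11516923 4k blocks (roughly 44 GiB). A sketch of the equivalent manual steps, assuming the partition itself has already been enlarged:

findmnt -no SOURCE /        # confirm the root filesystem device (here /dev/vda9)
resize2fs /dev/vda9         # ext4 supports growing online while mounted on /
df -h /                     # verify the new size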
Jan 28 00:05:50.182269 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 28 00:05:50.185674 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 28 00:05:50.205700 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 28 00:05:50.208742 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 28 00:05:50.211021 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 28 00:05:50.212419 systemd[1]: Reached target getty.target - Login Prompts. Jan 28 00:05:50.537252 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 00:05:50.881242 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 00:05:50.892396 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 00:05:50.896162 (kubelet)[1769]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 28 00:05:51.450258 kubelet[1769]: E0128 00:05:51.450169 1769 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 28 00:05:51.452828 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 28 00:05:51.452950 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 28 00:05:51.454335 systemd[1]: kubelet.service: Consumed 784ms CPU time, 258.3M memory peak. Jan 28 00:05:52.545248 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 00:05:52.888272 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 00:05:56.552263 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 00:05:56.566224 coreos-metadata[1626]: Jan 28 00:05:56.563 WARN failed to locate config-drive, using the metadata service API instead Jan 28 00:05:56.580383 coreos-metadata[1626]: Jan 28 00:05:56.580 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 28 00:05:56.897270 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jan 28 00:05:56.903178 coreos-metadata[1718]: Jan 28 00:05:56.903 WARN failed to locate config-drive, using the metadata service API instead Jan 28 00:05:56.916055 coreos-metadata[1718]: Jan 28 00:05:56.916 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 28 00:06:00.135413 coreos-metadata[1718]: Jan 28 00:06:00.135 INFO Fetch successful Jan 28 00:06:00.135730 coreos-metadata[1718]: Jan 28 00:06:00.135 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 28 00:06:00.721109 coreos-metadata[1626]: Jan 28 00:06:00.720 INFO Fetch successful Jan 28 00:06:00.721467 coreos-metadata[1626]: Jan 28 00:06:00.721 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 28 00:06:01.372641 coreos-metadata[1718]: Jan 28 00:06:01.372 INFO Fetch successful Jan 28 00:06:01.374754 unknown[1718]: wrote ssh authorized keys file for user: core Jan 28 00:06:01.380250 coreos-metadata[1626]: Jan 28 00:06:01.380 INFO Fetch successful Jan 28 00:06:01.380250 coreos-metadata[1626]: Jan 28 00:06:01.380 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 28 00:06:01.404367 update-ssh-keys[1789]: Updated "/home/core/.ssh/authorized_keys" Jan 28 00:06:01.407301 systemd[1]: 
Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 28 00:06:01.408983 systemd[1]: Finished sshkeys.service. Jan 28 00:06:01.703563 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 28 00:06:01.705015 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 00:06:01.855395 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 00:06:01.858877 (kubelet)[1800]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 28 00:06:01.891791 kubelet[1800]: E0128 00:06:01.891738 1800 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 28 00:06:01.895172 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 28 00:06:01.895326 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 28 00:06:01.895687 systemd[1]: kubelet.service: Consumed 137ms CPU time, 107.3M memory peak. Jan 28 00:06:02.011445 coreos-metadata[1626]: Jan 28 00:06:02.011 INFO Fetch successful Jan 28 00:06:02.011445 coreos-metadata[1626]: Jan 28 00:06:02.011 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 28 00:06:02.638825 coreos-metadata[1626]: Jan 28 00:06:02.638 INFO Fetch successful Jan 28 00:06:02.638825 coreos-metadata[1626]: Jan 28 00:06:02.638 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 28 00:06:03.262656 coreos-metadata[1626]: Jan 28 00:06:03.262 INFO Fetch successful Jan 28 00:06:03.262656 coreos-metadata[1626]: Jan 28 00:06:03.262 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 28 00:06:03.887364 coreos-metadata[1626]: Jan 28 00:06:03.887 INFO Fetch successful Jan 28 00:06:03.911017 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 28 00:06:03.911423 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 28 00:06:03.912127 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 28 00:06:03.912462 systemd[1]: Startup finished in 2.457s (kernel) + 13.768s (initrd) + 17.008s (userspace) = 33.234s. Jan 28 00:06:12.146240 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 28 00:06:12.147558 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 00:06:12.258930 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
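The coreos-metadata entries above show the agent failing to find a config-drive (the repeated "config-2: Can't lookup blockdev" kernel messages) and falling back to the cloud metadata service at 169.254.169.254, from which it fetched the hostname, instance identifiers, addresses, and the core user's SSH key. The same endpoints can be queried by hand from the node; the URLs below are copied from the log, and which paths exist depends on the platform:

curl -s http://169.254.169.254/openstack/2012-08-10/meta_data.json | head
curl -s http://169.254.169.254/latest/meta-data/hostname
curl -s http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key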
Jan 28 00:06:12.262964 (kubelet)[1822]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 28 00:06:12.306429 kubelet[1822]: E0128 00:06:12.306367 1822 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 28 00:06:12.309153 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 28 00:06:12.309298 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 28 00:06:12.309676 systemd[1]: kubelet.service: Consumed 140ms CPU time, 106.3M memory peak. Jan 28 00:06:13.361588 chronyd[1624]: Selected source PHC0 Jan 28 00:06:14.170927 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 28 00:06:14.171965 systemd[1]: Started sshd@0-10.0.1.105:22-4.153.228.146:43370.service - OpenSSH per-connection server daemon (4.153.228.146:43370). Jan 28 00:06:14.663102 sshd[1832]: Accepted publickey for core from 4.153.228.146 port 43370 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:06:14.665519 sshd-session[1832]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:06:14.671191 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 28 00:06:14.671982 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 28 00:06:14.675319 systemd-logind[1641]: New session 1 of user core. Jan 28 00:06:14.688585 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 28 00:06:14.690784 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 28 00:06:14.706991 (systemd)[1838]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:06:14.709331 systemd-logind[1641]: New session 2 of user core. Jan 28 00:06:14.826710 systemd[1838]: Queued start job for default target default.target. Jan 28 00:06:14.845367 systemd[1838]: Created slice app.slice - User Application Slice. Jan 28 00:06:14.845559 systemd[1838]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 28 00:06:14.845574 systemd[1838]: Reached target paths.target - Paths. Jan 28 00:06:14.845626 systemd[1838]: Reached target timers.target - Timers. Jan 28 00:06:14.846692 systemd[1838]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 28 00:06:14.847353 systemd[1838]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 28 00:06:14.855433 systemd[1838]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 28 00:06:14.855480 systemd[1838]: Reached target sockets.target - Sockets. Jan 28 00:06:14.856014 systemd[1838]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 28 00:06:14.856069 systemd[1838]: Reached target basic.target - Basic System. Jan 28 00:06:14.856110 systemd[1838]: Reached target default.target - Main User Target. Jan 28 00:06:14.856149 systemd[1838]: Startup finished in 142ms. Jan 28 00:06:14.856344 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 28 00:06:14.863574 systemd[1]: Started session-1.scope - Session 1 of User core. 
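The repeated kubelet failures above ("failed to load Kubelet config file /var/lib/kubelet/config.yaml ... no such file or directory", each followed by systemd scheduling another restart) are the normal state of a node that has not yet been joined to a cluster: kubeadm init or kubeadm join writes that file, along with the bootstrap kubeconfig the kubelet also needs. Purely as an illustration of what the missing file looks like, a minimal hand-written KubeletConfiguration is sketched below; every value is an assumption rather than something taken from this host, and cgroupDriver: systemd is chosen to match the SystemdCgroup=true runc option in the containerd config logged earlier.

# Illustrative only: a minimal stand-in for the file kubeadm would normally write.
mkdir -p /var/lib/kubelet
cat <<'EOF' > /var/lib/kubelet/config.yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd
authentication:
  anonymous:
    enabled: false
authorization:
  mode: Webhook
clusterDNS:
  - 10.96.0.10
clusterDomain: cluster.local
EOF
systemctl restart kubelet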
Jan 28 00:06:15.140248 systemd[1]: Started sshd@1-10.0.1.105:22-4.153.228.146:59516.service - OpenSSH per-connection server daemon (4.153.228.146:59516). Jan 28 00:06:15.623302 sshd[1852]: Accepted publickey for core from 4.153.228.146 port 59516 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:06:15.624486 sshd-session[1852]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:06:15.628231 systemd-logind[1641]: New session 3 of user core. Jan 28 00:06:15.640417 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 28 00:06:15.887024 sshd[1856]: Connection closed by 4.153.228.146 port 59516 Jan 28 00:06:15.887415 sshd-session[1852]: pam_unix(sshd:session): session closed for user core Jan 28 00:06:15.891602 systemd[1]: sshd@1-10.0.1.105:22-4.153.228.146:59516.service: Deactivated successfully. Jan 28 00:06:15.892980 systemd[1]: session-3.scope: Deactivated successfully. Jan 28 00:06:15.893623 systemd-logind[1641]: Session 3 logged out. Waiting for processes to exit. Jan 28 00:06:15.894363 systemd-logind[1641]: Removed session 3. Jan 28 00:06:15.987062 systemd[1]: Started sshd@2-10.0.1.105:22-4.153.228.146:59530.service - OpenSSH per-connection server daemon (4.153.228.146:59530). Jan 28 00:06:16.478935 sshd[1862]: Accepted publickey for core from 4.153.228.146 port 59530 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:06:16.480080 sshd-session[1862]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:06:16.483355 systemd-logind[1641]: New session 4 of user core. Jan 28 00:06:16.497512 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 28 00:06:16.739646 sshd[1866]: Connection closed by 4.153.228.146 port 59530 Jan 28 00:06:16.740263 sshd-session[1862]: pam_unix(sshd:session): session closed for user core Jan 28 00:06:16.743620 systemd[1]: sshd@2-10.0.1.105:22-4.153.228.146:59530.service: Deactivated successfully. Jan 28 00:06:16.744971 systemd[1]: session-4.scope: Deactivated successfully. Jan 28 00:06:16.745550 systemd-logind[1641]: Session 4 logged out. Waiting for processes to exit. Jan 28 00:06:16.746334 systemd-logind[1641]: Removed session 4. Jan 28 00:06:16.840113 systemd[1]: Started sshd@3-10.0.1.105:22-4.153.228.146:59534.service - OpenSSH per-connection server daemon (4.153.228.146:59534). Jan 28 00:06:17.326007 sshd[1872]: Accepted publickey for core from 4.153.228.146 port 59534 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:06:17.327214 sshd-session[1872]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:06:17.330923 systemd-logind[1641]: New session 5 of user core. Jan 28 00:06:17.338325 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 28 00:06:17.590282 sshd[1876]: Connection closed by 4.153.228.146 port 59534 Jan 28 00:06:17.589761 sshd-session[1872]: pam_unix(sshd:session): session closed for user core Jan 28 00:06:17.593732 systemd[1]: sshd@3-10.0.1.105:22-4.153.228.146:59534.service: Deactivated successfully. Jan 28 00:06:17.595119 systemd[1]: session-5.scope: Deactivated successfully. Jan 28 00:06:17.597852 systemd-logind[1641]: Session 5 logged out. Waiting for processes to exit. Jan 28 00:06:17.598630 systemd-logind[1641]: Removed session 5. Jan 28 00:06:17.684940 systemd[1]: Started sshd@4-10.0.1.105:22-4.153.228.146:59536.service - OpenSSH per-connection server daemon (4.153.228.146:59536). 
Jan 28 00:06:18.170231 sshd[1882]: Accepted publickey for core from 4.153.228.146 port 59536 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:06:18.171223 sshd-session[1882]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:06:18.175322 systemd-logind[1641]: New session 6 of user core. Jan 28 00:06:18.188552 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 28 00:06:18.362654 sudo[1887]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 28 00:06:18.362878 sudo[1887]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 28 00:06:18.382118 sudo[1887]: pam_unix(sudo:session): session closed for user root Jan 28 00:06:18.470297 sshd[1886]: Connection closed by 4.153.228.146 port 59536 Jan 28 00:06:18.469392 sshd-session[1882]: pam_unix(sshd:session): session closed for user core Jan 28 00:06:18.473258 systemd[1]: sshd@4-10.0.1.105:22-4.153.228.146:59536.service: Deactivated successfully. Jan 28 00:06:18.474827 systemd[1]: session-6.scope: Deactivated successfully. Jan 28 00:06:18.477463 systemd-logind[1641]: Session 6 logged out. Waiting for processes to exit. Jan 28 00:06:18.478161 systemd-logind[1641]: Removed session 6. Jan 28 00:06:18.573343 systemd[1]: Started sshd@5-10.0.1.105:22-4.153.228.146:59550.service - OpenSSH per-connection server daemon (4.153.228.146:59550). Jan 28 00:06:19.074259 sshd[1894]: Accepted publickey for core from 4.153.228.146 port 59550 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:06:19.075427 sshd-session[1894]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:06:19.079112 systemd-logind[1641]: New session 7 of user core. Jan 28 00:06:19.093937 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 28 00:06:19.269191 sudo[1900]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 28 00:06:19.269466 sudo[1900]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 28 00:06:19.279062 sudo[1900]: pam_unix(sudo:session): session closed for user root Jan 28 00:06:19.285025 sudo[1899]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 28 00:06:19.285335 sudo[1899]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 28 00:06:19.292979 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 28 00:06:19.333254 kernel: kauditd_printk_skb: 186 callbacks suppressed Jan 28 00:06:19.333342 kernel: audit: type=1305 audit(1769558779.330:230): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 28 00:06:19.330000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 28 00:06:19.333445 augenrules[1924]: No rules Jan 28 00:06:19.330000 audit[1924]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffe0ab03c0 a2=420 a3=0 items=0 ppid=1905 pid=1924 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:19.336079 systemd[1]: audit-rules.service: Deactivated successfully. Jan 28 00:06:19.336332 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
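In the sudo session above, the two audit rule fragments under /etc/audit/rules.d/ were removed and audit-rules.service was restarted; augenrules then assembled an empty rule set ("No rules"), and the hex proctitle in the accompanying SYSCALL record is the NUL-separated command line of the loader. A small sketch of decoding that field and of the equivalent manual steps; the decode pipeline is illustrative, and any tool that reverses hex and splits on NUL bytes works:

# Decode the proctitle hex copied from the audit record above:
echo 2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 | xxd -r -p | tr '\0' ' '; echo
# -> /sbin/auditctl -R /etc/audit/audit.rules

auditctl -R /etc/audit/audit.rules   # reload rules from the merged file
auditctl -l                          # list loaded rules; prints "No rules" here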
Jan 28 00:06:19.337246 sudo[1899]: pam_unix(sudo:session): session closed for user root Jan 28 00:06:19.339128 kernel: audit: type=1300 audit(1769558779.330:230): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffe0ab03c0 a2=420 a3=0 items=0 ppid=1905 pid=1924 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:19.339179 kernel: audit: type=1327 audit(1769558779.330:230): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 28 00:06:19.330000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 28 00:06:19.334000 audit[1899]: USER_END pid=1899 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 00:06:19.343512 kernel: audit: type=1106 audit(1769558779.334:231): pid=1899 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 00:06:19.334000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:19.346013 kernel: audit: type=1130 audit(1769558779.334:232): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:19.334000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:19.348537 kernel: audit: type=1131 audit(1769558779.334:233): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:19.334000 audit[1899]: CRED_DISP pid=1899 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 00:06:19.351048 kernel: audit: type=1104 audit(1769558779.334:234): pid=1899 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 00:06:19.431983 sshd[1898]: Connection closed by 4.153.228.146 port 59550 Jan 28 00:06:19.432802 sshd-session[1894]: pam_unix(sshd:session): session closed for user core Jan 28 00:06:19.432000 audit[1894]: USER_END pid=1894 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:06:19.436537 systemd[1]: sshd@5-10.0.1.105:22-4.153.228.146:59550.service: Deactivated successfully. 
Jan 28 00:06:19.432000 audit[1894]: CRED_DISP pid=1894 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:06:19.438526 systemd[1]: session-7.scope: Deactivated successfully. Jan 28 00:06:19.439944 systemd-logind[1641]: Session 7 logged out. Waiting for processes to exit. Jan 28 00:06:19.440751 systemd-logind[1641]: Removed session 7. Jan 28 00:06:19.441028 kernel: audit: type=1106 audit(1769558779.432:235): pid=1894 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:06:19.441067 kernel: audit: type=1104 audit(1769558779.432:236): pid=1894 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:06:19.441083 kernel: audit: type=1131 audit(1769558779.434:237): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.0.1.105:22-4.153.228.146:59550 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:19.434000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.0.1.105:22-4.153.228.146:59550 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:19.540000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.1.105:22-4.153.228.146:59562 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:19.541171 systemd[1]: Started sshd@6-10.0.1.105:22-4.153.228.146:59562.service - OpenSSH per-connection server daemon (4.153.228.146:59562). 
Jan 28 00:06:20.050000 audit[1933]: USER_ACCT pid=1933 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:06:20.051979 sshd[1933]: Accepted publickey for core from 4.153.228.146 port 59562 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:06:20.051000 audit[1933]: CRED_ACQ pid=1933 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:06:20.051000 audit[1933]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffebdc6160 a2=3 a3=0 items=0 ppid=1 pid=1933 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:20.051000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:06:20.053238 sshd-session[1933]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:06:20.057410 systemd-logind[1641]: New session 8 of user core. Jan 28 00:06:20.067585 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 28 00:06:20.068000 audit[1933]: USER_START pid=1933 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:06:20.070000 audit[1937]: CRED_ACQ pid=1937 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:06:20.245000 audit[1938]: USER_ACCT pid=1938 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 00:06:20.245000 audit[1938]: CRED_REFR pid=1938 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 00:06:20.247042 sudo[1938]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 28 00:06:20.246000 audit[1938]: USER_START pid=1938 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 00:06:20.247327 sudo[1938]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 28 00:06:20.576065 systemd[1]: Starting docker.service - Docker Application Container Engine... 
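The dockerd startup that follows registers Docker's standard firewall chains; note exe="/usr/bin/xtables-nft-multi" and op=nft_register_chain in the NETFILTER_CFG records, i.e. the iptables-nft backend translating each rule into nftables. Decoded from the hex proctitle fields of the records below, the invocations are equivalent to the commands sketched here (the ip6tables records with family=10 mirror them, and a few rules, such as the nat OUTPUT rule excluding 127.0.0.0/8, are omitted):

iptables --wait -t nat -N DOCKER
iptables --wait -t filter -N DOCKER
iptables --wait -t filter -N DOCKER-FORWARD
iptables --wait -t filter -N DOCKER-BRIDGE
iptables --wait -t filter -N DOCKER-CT
iptables --wait -t filter -N DOCKER-ISOLATION-STAGE-1
iptables --wait -t filter -N DOCKER-ISOLATION-STAGE-2
iptables --wait -t nat -A PREROUTING -m addrtype --dst-type LOCAL -j DOCKER
iptables --wait -I FORWARD -j DOCKER-FORWARD
iptables --wait -I DOCKER-FORWARD -j DOCKER-BRIDGE
iptables --wait -I DOCKER-FORWARD -j DOCKER-ISOLATION-STAGE-1
iptables --wait -I DOCKER-FORWARD -j DOCKER-CT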
Jan 28 00:06:20.591264 (dockerd)[1959]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 28 00:06:20.845264 dockerd[1959]: time="2026-01-28T00:06:20.845124519Z" level=info msg="Starting up" Jan 28 00:06:20.846329 dockerd[1959]: time="2026-01-28T00:06:20.846300562Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 28 00:06:20.856386 dockerd[1959]: time="2026-01-28T00:06:20.856349793Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 28 00:06:20.893170 dockerd[1959]: time="2026-01-28T00:06:20.893107705Z" level=info msg="Loading containers: start." Jan 28 00:06:20.904520 kernel: Initializing XFRM netlink socket Jan 28 00:06:20.953000 audit[2010]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2010 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:20.953000 audit[2010]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffc6d52490 a2=0 a3=0 items=0 ppid=1959 pid=2010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:20.953000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 28 00:06:20.955000 audit[2012]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2012 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:20.955000 audit[2012]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffca9a9280 a2=0 a3=0 items=0 ppid=1959 pid=2012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:20.955000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 28 00:06:20.958000 audit[2014]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2014 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:20.958000 audit[2014]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc01498a0 a2=0 a3=0 items=0 ppid=1959 pid=2014 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:20.958000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 28 00:06:20.959000 audit[2016]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2016 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:20.959000 audit[2016]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe5bdfb40 a2=0 a3=0 items=0 ppid=1959 pid=2016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:20.959000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 28 00:06:20.961000 audit[2018]: NETFILTER_CFG table=filter:6 family=2 entries=1 
op=nft_register_chain pid=2018 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:20.961000 audit[2018]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff5f462a0 a2=0 a3=0 items=0 ppid=1959 pid=2018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:20.961000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 28 00:06:20.963000 audit[2020]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2020 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:20.963000 audit[2020]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc8ad4750 a2=0 a3=0 items=0 ppid=1959 pid=2020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:20.963000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 28 00:06:20.965000 audit[2022]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2022 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:20.965000 audit[2022]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc4c06c30 a2=0 a3=0 items=0 ppid=1959 pid=2022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:20.965000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 28 00:06:20.966000 audit[2024]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2024 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:20.966000 audit[2024]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=fffff4d2da40 a2=0 a3=0 items=0 ppid=1959 pid=2024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:20.966000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 28 00:06:20.996000 audit[2027]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2027 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:20.996000 audit[2027]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=ffffc978e270 a2=0 a3=0 items=0 ppid=1959 pid=2027 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:20.996000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 28 00:06:20.998000 audit[2029]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2029 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:20.998000 audit[2029]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffffe41e2b0 a2=0 a3=0 items=0 ppid=1959 pid=2029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:20.998000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 28 00:06:21.000000 audit[2031]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2031 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:21.000000 audit[2031]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffe005adc0 a2=0 a3=0 items=0 ppid=1959 pid=2031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.000000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 28 00:06:21.001000 audit[2033]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2033 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:21.001000 audit[2033]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffcf690e40 a2=0 a3=0 items=0 ppid=1959 pid=2033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.001000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 28 00:06:21.003000 audit[2035]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2035 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:21.003000 audit[2035]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffc0439c40 a2=0 a3=0 items=0 ppid=1959 pid=2035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.003000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 28 00:06:21.035000 audit[2065]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2065 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.035000 audit[2065]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffddfb2360 a2=0 a3=0 items=0 ppid=1959 pid=2065 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.035000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 28 00:06:21.037000 audit[2067]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2067 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.037000 audit[2067]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffd731eab0 a2=0 a3=0 items=0 ppid=1959 
pid=2067 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.037000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 28 00:06:21.038000 audit[2069]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2069 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.038000 audit[2069]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe045e520 a2=0 a3=0 items=0 ppid=1959 pid=2069 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.038000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 28 00:06:21.040000 audit[2071]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2071 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.040000 audit[2071]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd9533850 a2=0 a3=0 items=0 ppid=1959 pid=2071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.040000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 28 00:06:21.041000 audit[2073]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2073 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.041000 audit[2073]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc1b4df70 a2=0 a3=0 items=0 ppid=1959 pid=2073 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.041000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 28 00:06:21.043000 audit[2075]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2075 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.043000 audit[2075]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc0652300 a2=0 a3=0 items=0 ppid=1959 pid=2075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.043000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 28 00:06:21.045000 audit[2077]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2077 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.045000 audit[2077]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc1805f40 a2=0 a3=0 items=0 ppid=1959 pid=2077 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 
00:06:21.045000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 28 00:06:21.047000 audit[2079]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2079 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.047000 audit[2079]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffddc63f80 a2=0 a3=0 items=0 ppid=1959 pid=2079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.047000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 28 00:06:21.049000 audit[2081]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2081 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.049000 audit[2081]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffdc46ab10 a2=0 a3=0 items=0 ppid=1959 pid=2081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.049000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 28 00:06:21.050000 audit[2083]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2083 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.050000 audit[2083]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffcae4cd30 a2=0 a3=0 items=0 ppid=1959 pid=2083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.050000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 28 00:06:21.052000 audit[2085]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2085 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.052000 audit[2085]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffea17fff0 a2=0 a3=0 items=0 ppid=1959 pid=2085 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.052000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 28 00:06:21.053000 audit[2087]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2087 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.053000 audit[2087]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffea21be00 a2=0 a3=0 items=0 ppid=1959 pid=2087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.053000 audit: 
PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 28 00:06:21.055000 audit[2089]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2089 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.055000 audit[2089]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffd08a3640 a2=0 a3=0 items=0 ppid=1959 pid=2089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.055000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 28 00:06:21.060000 audit[2094]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2094 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:21.060000 audit[2094]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd9f422b0 a2=0 a3=0 items=0 ppid=1959 pid=2094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.060000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 28 00:06:21.061000 audit[2096]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2096 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:21.061000 audit[2096]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=fffffac302b0 a2=0 a3=0 items=0 ppid=1959 pid=2096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.061000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 28 00:06:21.063000 audit[2098]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2098 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:21.063000 audit[2098]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffe72845d0 a2=0 a3=0 items=0 ppid=1959 pid=2098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.063000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 28 00:06:21.065000 audit[2100]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.065000 audit[2100]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff7b3f520 a2=0 a3=0 items=0 ppid=1959 pid=2100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.065000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 28 00:06:21.066000 audit[2102]: NETFILTER_CFG table=filter:32 family=10 
entries=1 op=nft_register_rule pid=2102 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.066000 audit[2102]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffed4aa2b0 a2=0 a3=0 items=0 ppid=1959 pid=2102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.066000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 28 00:06:21.068000 audit[2104]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2104 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:21.068000 audit[2104]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffd0170fc0 a2=0 a3=0 items=0 ppid=1959 pid=2104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.068000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 28 00:06:21.091000 audit[2109]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2109 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:21.091000 audit[2109]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=ffffd6fd6900 a2=0 a3=0 items=0 ppid=1959 pid=2109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.091000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 28 00:06:21.093000 audit[2111]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2111 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:21.093000 audit[2111]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffdea8e8c0 a2=0 a3=0 items=0 ppid=1959 pid=2111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.093000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 28 00:06:21.099000 audit[2119]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2119 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:21.099000 audit[2119]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=fffff3b88330 a2=0 a3=0 items=0 ppid=1959 pid=2119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.099000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 28 00:06:21.109000 audit[2125]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2125 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 
00:06:21.109000 audit[2125]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffcfe58880 a2=0 a3=0 items=0 ppid=1959 pid=2125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.109000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 28 00:06:21.111000 audit[2127]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2127 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:21.111000 audit[2127]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=ffffdb340910 a2=0 a3=0 items=0 ppid=1959 pid=2127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.111000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 28 00:06:21.113000 audit[2129]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2129 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:21.113000 audit[2129]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffe657ad30 a2=0 a3=0 items=0 ppid=1959 pid=2129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.113000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 28 00:06:21.115000 audit[2131]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2131 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:21.115000 audit[2131]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=fffff6bf6e00 a2=0 a3=0 items=0 ppid=1959 pid=2131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.115000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 28 00:06:21.117000 audit[2133]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2133 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:21.117000 audit[2133]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffe05a8040 a2=0 a3=0 items=0 ppid=1959 pid=2133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:21.117000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 28 00:06:21.118779 
systemd-networkd[1576]: docker0: Link UP Jan 28 00:06:21.122948 dockerd[1959]: time="2026-01-28T00:06:21.122899042Z" level=info msg="Loading containers: done." Jan 28 00:06:21.134492 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1507456083-merged.mount: Deactivated successfully. Jan 28 00:06:21.150945 dockerd[1959]: time="2026-01-28T00:06:21.150892927Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 28 00:06:21.151070 dockerd[1959]: time="2026-01-28T00:06:21.150988688Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 28 00:06:21.151220 dockerd[1959]: time="2026-01-28T00:06:21.151175768Z" level=info msg="Initializing buildkit" Jan 28 00:06:21.171100 dockerd[1959]: time="2026-01-28T00:06:21.171047429Z" level=info msg="Completed buildkit initialization" Jan 28 00:06:21.178649 dockerd[1959]: time="2026-01-28T00:06:21.178576172Z" level=info msg="Daemon has completed initialization" Jan 28 00:06:21.178766 dockerd[1959]: time="2026-01-28T00:06:21.178634332Z" level=info msg="API listen on /run/docker.sock" Jan 28 00:06:21.177000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:21.179014 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 28 00:06:22.421516 containerd[1660]: time="2026-01-28T00:06:22.421482020Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\"" Jan 28 00:06:22.560079 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 28 00:06:22.561448 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 00:06:22.685000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:22.686224 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 00:06:22.689647 (kubelet)[2183]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 28 00:06:22.719072 kubelet[2183]: E0128 00:06:22.719020 2183 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 28 00:06:22.721435 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 28 00:06:22.721559 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 28 00:06:22.720000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 28 00:06:22.721879 systemd[1]: kubelet.service: Consumed 129ms CPU time, 107.4M memory peak. Jan 28 00:06:23.032020 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2577408681.mount: Deactivated successfully. 
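The PROCTITLE fields in the Docker netfilter audit records above are hex-encoded command lines with NUL bytes separating the argv elements; decoding them shows the exact iptables/ip6tables invocations dockerd issued (the DOCKER-ISOLATION-STAGE-1/2, DOCKER-USER and DOCKER-FORWARD chains, the 172.17.0.0/16 MASQUERADE rule on docker0, and so on). A minimal Python sketch, using the first PROCTITLE value from the records above:

    # Decode a Linux audit PROCTITLE value: hex-encoded argv, NUL-separated.
    proctitle = (
        "2F7573722F62696E2F6970367461626C6573002D2D77616974"
        "002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32"
    )
    argv = bytes.fromhex(proctitle).split(b"\x00")
    print(" ".join(a.decode() for a in argv))
    # -> /usr/bin/ip6tables --wait -t filter -N DOCKER-ISOLATION-STAGE-2

The same decoding applies to every PROCTITLE line in this section; only the hex payload changes.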
Jan 28 00:06:24.834269 containerd[1660]: time="2026-01-28T00:06:24.834005988Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:24.836202 containerd[1660]: time="2026-01-28T00:06:24.836161314Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=25791094" Jan 28 00:06:24.838221 containerd[1660]: time="2026-01-28T00:06:24.838182800Z" level=info msg="ImageCreate event name:\"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:24.841577 containerd[1660]: time="2026-01-28T00:06:24.841543531Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:24.843401 containerd[1660]: time="2026-01-28T00:06:24.843266936Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"27383880\" in 2.421749076s" Jan 28 00:06:24.843401 containerd[1660]: time="2026-01-28T00:06:24.843299416Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\"" Jan 28 00:06:24.844827 containerd[1660]: time="2026-01-28T00:06:24.844785580Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\"" Jan 28 00:06:26.073177 containerd[1660]: time="2026-01-28T00:06:26.073069671Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:26.075263 containerd[1660]: time="2026-01-28T00:06:26.075193718Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=23547679" Jan 28 00:06:26.077185 containerd[1660]: time="2026-01-28T00:06:26.077133123Z" level=info msg="ImageCreate event name:\"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:26.080718 containerd[1660]: time="2026-01-28T00:06:26.080661334Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:26.081607 containerd[1660]: time="2026-01-28T00:06:26.081561217Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"25137562\" in 1.236664876s" Jan 28 00:06:26.081607 containerd[1660]: time="2026-01-28T00:06:26.081600977Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image reference \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\"" Jan 28 00:06:26.082263 
containerd[1660]: time="2026-01-28T00:06:26.082080059Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\"" Jan 28 00:06:27.187381 containerd[1660]: time="2026-01-28T00:06:27.187310535Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:27.189710 containerd[1660]: time="2026-01-28T00:06:27.189452942Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=18289931" Jan 28 00:06:27.191088 containerd[1660]: time="2026-01-28T00:06:27.191054547Z" level=info msg="ImageCreate event name:\"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:27.194450 containerd[1660]: time="2026-01-28T00:06:27.194401677Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:27.195114 containerd[1660]: time="2026-01-28T00:06:27.195087199Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"19882566\" in 1.11295382s" Jan 28 00:06:27.195161 containerd[1660]: time="2026-01-28T00:06:27.195120279Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\"" Jan 28 00:06:27.195562 containerd[1660]: time="2026-01-28T00:06:27.195533360Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\"" Jan 28 00:06:28.184929 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2494212920.mount: Deactivated successfully. 
Jan 28 00:06:28.432116 containerd[1660]: time="2026-01-28T00:06:28.432054156Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:28.433778 containerd[1660]: time="2026-01-28T00:06:28.433727521Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=18413667" Jan 28 00:06:28.434634 containerd[1660]: time="2026-01-28T00:06:28.434601924Z" level=info msg="ImageCreate event name:\"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:28.436892 containerd[1660]: time="2026-01-28T00:06:28.436765930Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:28.437564 containerd[1660]: time="2026-01-28T00:06:28.437494653Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"28257692\" in 1.241924692s" Jan 28 00:06:28.437564 containerd[1660]: time="2026-01-28T00:06:28.437559813Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\"" Jan 28 00:06:28.438091 containerd[1660]: time="2026-01-28T00:06:28.438054934Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Jan 28 00:06:28.986788 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3911021112.mount: Deactivated successfully. 
Jan 28 00:06:29.696154 containerd[1660]: time="2026-01-28T00:06:29.696104995Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:29.699240 containerd[1660]: time="2026-01-28T00:06:29.698843043Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=18338344" Jan 28 00:06:29.700800 containerd[1660]: time="2026-01-28T00:06:29.700746689Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:29.704244 containerd[1660]: time="2026-01-28T00:06:29.704178539Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:29.706074 containerd[1660]: time="2026-01-28T00:06:29.706044545Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.267958491s" Jan 28 00:06:29.706132 containerd[1660]: time="2026-01-28T00:06:29.706077585Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Jan 28 00:06:29.706468 containerd[1660]: time="2026-01-28T00:06:29.706446706Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 28 00:06:30.271960 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1286635019.mount: Deactivated successfully. 
Jan 28 00:06:30.281566 containerd[1660]: time="2026-01-28T00:06:30.281522572Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 28 00:06:30.283107 containerd[1660]: time="2026-01-28T00:06:30.283049336Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 28 00:06:30.284225 containerd[1660]: time="2026-01-28T00:06:30.284168780Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 28 00:06:30.287119 containerd[1660]: time="2026-01-28T00:06:30.287090748Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 28 00:06:30.287755 containerd[1660]: time="2026-01-28T00:06:30.287731470Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 581.234604ms" Jan 28 00:06:30.287814 containerd[1660]: time="2026-01-28T00:06:30.287759551Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Jan 28 00:06:30.288199 containerd[1660]: time="2026-01-28T00:06:30.288156272Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Jan 28 00:06:30.889868 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4163009682.mount: Deactivated successfully. 
Jan 28 00:06:32.620395 containerd[1660]: time="2026-01-28T00:06:32.620318990Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:32.622135 containerd[1660]: time="2026-01-28T00:06:32.622087755Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=57926377" Jan 28 00:06:32.623410 containerd[1660]: time="2026-01-28T00:06:32.623382039Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:32.629905 containerd[1660]: time="2026-01-28T00:06:32.629432698Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:32.630504 containerd[1660]: time="2026-01-28T00:06:32.630480301Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 2.342275989s" Jan 28 00:06:32.630587 containerd[1660]: time="2026-01-28T00:06:32.630572621Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Jan 28 00:06:32.836149 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 28 00:06:32.837548 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 00:06:32.961080 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 00:06:32.960000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:32.964390 kernel: kauditd_printk_skb: 134 callbacks suppressed Jan 28 00:06:32.964443 kernel: audit: type=1130 audit(1769558792.960:290): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:32.965098 (kubelet)[2393]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 28 00:06:32.996066 kubelet[2393]: E0128 00:06:32.995961 2393 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 28 00:06:32.998855 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 28 00:06:32.998985 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 28 00:06:33.000000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 28 00:06:33.001349 systemd[1]: kubelet.service: Consumed 133ms CPU time, 111.6M memory peak. 
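As a cross-check, the numbers inside the kernel audit lines, such as audit(1769558792.960:290) above, are a Unix epoch timestamp (seconds.milliseconds) followed by an event serial number, and they line up with the journal's wall-clock prefix on the same record. A small Python check using the value from the record above:

    # Convert an audit record timestamp to UTC wall-clock time.
    from datetime import datetime, timezone

    print(datetime.fromtimestamp(1769558792.960, tz=timezone.utc))
    # -> 2026-01-28 00:06:32.960000+00:00, matching the "Jan 28 00:06:32.960000"
    #    journal prefix of the same kubelet SERVICE_START record.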
Jan 28 00:06:33.004256 kernel: audit: type=1131 audit(1769558793.000:291): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 28 00:06:34.691336 update_engine[1643]: I20260128 00:06:34.691231 1643 update_attempter.cc:509] Updating boot flags... Jan 28 00:06:38.416802 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 00:06:38.416955 systemd[1]: kubelet.service: Consumed 133ms CPU time, 111.6M memory peak. Jan 28 00:06:38.415000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:38.421350 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 00:06:38.415000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:38.424334 kernel: audit: type=1130 audit(1769558798.415:292): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:38.424376 kernel: audit: type=1131 audit(1769558798.415:293): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:38.444720 systemd[1]: Reload requested from client PID 2440 ('systemctl') (unit session-8.scope)... Jan 28 00:06:38.444735 systemd[1]: Reloading... Jan 28 00:06:38.525300 zram_generator::config[2486]: No configuration found. Jan 28 00:06:38.695881 systemd[1]: Reloading finished in 250 ms. 
Jan 28 00:06:38.724226 kernel: audit: type=1334 audit(1769558798.718:294): prog-id=63 op=LOAD Jan 28 00:06:38.724316 kernel: audit: type=1334 audit(1769558798.718:295): prog-id=54 op=UNLOAD Jan 28 00:06:38.724333 kernel: audit: type=1334 audit(1769558798.718:296): prog-id=64 op=LOAD Jan 28 00:06:38.718000 audit: BPF prog-id=63 op=LOAD Jan 28 00:06:38.718000 audit: BPF prog-id=54 op=UNLOAD Jan 28 00:06:38.718000 audit: BPF prog-id=64 op=LOAD Jan 28 00:06:38.725284 kernel: audit: type=1334 audit(1769558798.718:297): prog-id=65 op=LOAD Jan 28 00:06:38.718000 audit: BPF prog-id=65 op=LOAD Jan 28 00:06:38.727781 kernel: audit: type=1334 audit(1769558798.718:298): prog-id=55 op=UNLOAD Jan 28 00:06:38.727848 kernel: audit: type=1334 audit(1769558798.718:299): prog-id=56 op=UNLOAD Jan 28 00:06:38.718000 audit: BPF prog-id=55 op=UNLOAD Jan 28 00:06:38.718000 audit: BPF prog-id=56 op=UNLOAD Jan 28 00:06:38.720000 audit: BPF prog-id=66 op=LOAD Jan 28 00:06:38.720000 audit: BPF prog-id=60 op=UNLOAD Jan 28 00:06:38.729926 kernel: audit: type=1334 audit(1769558798.720:300): prog-id=66 op=LOAD Jan 28 00:06:38.730006 kernel: audit: type=1334 audit(1769558798.720:301): prog-id=60 op=UNLOAD Jan 28 00:06:38.721000 audit: BPF prog-id=67 op=LOAD Jan 28 00:06:38.721000 audit: BPF prog-id=68 op=LOAD Jan 28 00:06:38.721000 audit: BPF prog-id=61 op=UNLOAD Jan 28 00:06:38.721000 audit: BPF prog-id=62 op=UNLOAD Jan 28 00:06:38.722000 audit: BPF prog-id=69 op=LOAD Jan 28 00:06:38.722000 audit: BPF prog-id=70 op=LOAD Jan 28 00:06:38.722000 audit: BPF prog-id=49 op=UNLOAD Jan 28 00:06:38.722000 audit: BPF prog-id=50 op=UNLOAD Jan 28 00:06:38.722000 audit: BPF prog-id=71 op=LOAD Jan 28 00:06:38.722000 audit: BPF prog-id=58 op=UNLOAD Jan 28 00:06:38.724000 audit: BPF prog-id=72 op=LOAD Jan 28 00:06:38.724000 audit: BPF prog-id=46 op=UNLOAD Jan 28 00:06:38.724000 audit: BPF prog-id=73 op=LOAD Jan 28 00:06:38.724000 audit: BPF prog-id=74 op=LOAD Jan 28 00:06:38.724000 audit: BPF prog-id=47 op=UNLOAD Jan 28 00:06:38.724000 audit: BPF prog-id=48 op=UNLOAD Jan 28 00:06:38.724000 audit: BPF prog-id=75 op=LOAD Jan 28 00:06:38.724000 audit: BPF prog-id=59 op=UNLOAD Jan 28 00:06:38.725000 audit: BPF prog-id=76 op=LOAD Jan 28 00:06:38.725000 audit: BPF prog-id=57 op=UNLOAD Jan 28 00:06:38.726000 audit: BPF prog-id=77 op=LOAD Jan 28 00:06:38.726000 audit: BPF prog-id=43 op=UNLOAD Jan 28 00:06:38.726000 audit: BPF prog-id=78 op=LOAD Jan 28 00:06:38.726000 audit: BPF prog-id=79 op=LOAD Jan 28 00:06:38.726000 audit: BPF prog-id=44 op=UNLOAD Jan 28 00:06:38.726000 audit: BPF prog-id=45 op=UNLOAD Jan 28 00:06:38.727000 audit: BPF prog-id=80 op=LOAD Jan 28 00:06:38.742000 audit: BPF prog-id=51 op=UNLOAD Jan 28 00:06:38.742000 audit: BPF prog-id=81 op=LOAD Jan 28 00:06:38.742000 audit: BPF prog-id=82 op=LOAD Jan 28 00:06:38.742000 audit: BPF prog-id=52 op=UNLOAD Jan 28 00:06:38.742000 audit: BPF prog-id=53 op=UNLOAD Jan 28 00:06:38.752000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:38.755675 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 00:06:38.758436 systemd[1]: kubelet.service: Deactivated successfully. Jan 28 00:06:38.758663 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 28 00:06:38.757000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:38.758717 systemd[1]: kubelet.service: Consumed 95ms CPU time, 95.4M memory peak. Jan 28 00:06:38.760114 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 00:06:38.905741 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 00:06:38.904000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:38.909386 (kubelet)[2536]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 28 00:06:38.941253 kubelet[2536]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 00:06:38.941253 kubelet[2536]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 28 00:06:38.941253 kubelet[2536]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 00:06:38.941556 kubelet[2536]: I0128 00:06:38.941290 2536 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 28 00:06:39.754337 kubelet[2536]: I0128 00:06:39.754270 2536 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 28 00:06:39.754337 kubelet[2536]: I0128 00:06:39.754301 2536 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 28 00:06:39.754615 kubelet[2536]: I0128 00:06:39.754508 2536 server.go:956] "Client rotation is on, will bootstrap in background" Jan 28 00:06:39.800735 kubelet[2536]: I0128 00:06:39.800695 2536 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 28 00:06:39.805374 kubelet[2536]: E0128 00:06:39.805335 2536 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.1.105:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.1.105:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jan 28 00:06:39.817638 kubelet[2536]: I0128 00:06:39.817609 2536 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 28 00:06:39.821909 kubelet[2536]: I0128 00:06:39.821880 2536 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 28 00:06:39.822266 kubelet[2536]: I0128 00:06:39.822223 2536 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 28 00:06:39.822404 kubelet[2536]: I0128 00:06:39.822250 2536 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4593-0-0-n-ea467cc685","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 28 00:06:39.822488 kubelet[2536]: I0128 00:06:39.822476 2536 topology_manager.go:138] "Creating topology manager with none policy" Jan 28 00:06:39.822488 kubelet[2536]: I0128 00:06:39.822485 2536 container_manager_linux.go:303] "Creating device plugin manager" Jan 28 00:06:39.822717 kubelet[2536]: I0128 00:06:39.822684 2536 state_mem.go:36] "Initialized new in-memory state store" Jan 28 00:06:39.828985 kubelet[2536]: I0128 00:06:39.828957 2536 kubelet.go:480] "Attempting to sync node with API server" Jan 28 00:06:39.829014 kubelet[2536]: I0128 00:06:39.829000 2536 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 28 00:06:39.829053 kubelet[2536]: I0128 00:06:39.829042 2536 kubelet.go:386] "Adding apiserver pod source" Jan 28 00:06:39.833392 kubelet[2536]: I0128 00:06:39.833362 2536 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 28 00:06:39.833422 kubelet[2536]: E0128 00:06:39.833392 2536 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.1.105:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4593-0-0-n-ea467cc685&limit=500&resourceVersion=0\": dial tcp 10.0.1.105:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 28 00:06:39.833896 kubelet[2536]: E0128 00:06:39.833858 2536 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.1.105:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.1.105:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.Service" Jan 28 00:06:39.834473 kubelet[2536]: I0128 00:06:39.834440 2536 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 28 00:06:39.835159 kubelet[2536]: I0128 00:06:39.835126 2536 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 28 00:06:39.835299 kubelet[2536]: W0128 00:06:39.835271 2536 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 28 00:06:39.837504 kubelet[2536]: I0128 00:06:39.837466 2536 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 28 00:06:39.837551 kubelet[2536]: I0128 00:06:39.837510 2536 server.go:1289] "Started kubelet" Jan 28 00:06:39.838597 kubelet[2536]: I0128 00:06:39.838567 2536 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 28 00:06:39.842263 kubelet[2536]: I0128 00:06:39.842175 2536 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 28 00:06:39.842489 kubelet[2536]: I0128 00:06:39.842459 2536 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 28 00:06:39.842535 kubelet[2536]: I0128 00:06:39.842515 2536 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 28 00:06:39.842000 audit[2551]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2551 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:39.842000 audit[2551]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffe096b640 a2=0 a3=0 items=0 ppid=2536 pid=2551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:39.842000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 28 00:06:39.843756 kubelet[2536]: I0128 00:06:39.843548 2536 server.go:317] "Adding debug handlers to kubelet server" Jan 28 00:06:39.843000 audit[2553]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2553 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:39.843000 audit[2553]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd5e87de0 a2=0 a3=0 items=0 ppid=2536 pid=2553 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:39.843000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 28 00:06:39.845423 kubelet[2536]: E0128 00:06:39.844594 2536 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593-0-0-n-ea467cc685\" not found" Jan 28 00:06:39.845423 kubelet[2536]: I0128 00:06:39.844671 2536 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 28 00:06:39.845423 kubelet[2536]: I0128 00:06:39.844880 2536 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 28 00:06:39.845423 kubelet[2536]: I0128 00:06:39.844968 2536 reconciler.go:26] "Reconciler: start to sync state" Jan 28 00:06:39.845423 kubelet[2536]: E0128 00:06:39.845318 2536 reflector.go:200] 
"Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.1.105:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.1.105:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 28 00:06:39.845830 kubelet[2536]: I0128 00:06:39.845780 2536 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 28 00:06:39.846620 kubelet[2536]: E0128 00:06:39.846587 2536 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 28 00:06:39.847506 kubelet[2536]: I0128 00:06:39.846738 2536 factory.go:223] Registration of the containerd container factory successfully Jan 28 00:06:39.847506 kubelet[2536]: I0128 00:06:39.846755 2536 factory.go:223] Registration of the systemd container factory successfully Jan 28 00:06:39.847506 kubelet[2536]: I0128 00:06:39.846967 2536 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 28 00:06:39.847506 kubelet[2536]: E0128 00:06:39.843598 2536 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.1.105:6443/api/v1/namespaces/default/events\": dial tcp 10.0.1.105:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4593-0-0-n-ea467cc685.188ebc5a597ace0b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4593-0-0-n-ea467cc685,UID:ci-4593-0-0-n-ea467cc685,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4593-0-0-n-ea467cc685,},FirstTimestamp:2026-01-28 00:06:39.837482507 +0000 UTC m=+0.924329968,LastTimestamp:2026-01-28 00:06:39.837482507 +0000 UTC m=+0.924329968,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4593-0-0-n-ea467cc685,}" Jan 28 00:06:39.848000 audit[2556]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2556 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:39.848000 audit[2556]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffdaa4ffd0 a2=0 a3=0 items=0 ppid=2536 pid=2556 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:39.848000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 28 00:06:39.850000 audit[2558]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2558 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:39.850000 audit[2558]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffc7adb1e0 a2=0 a3=0 items=0 ppid=2536 pid=2558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:39.850000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 28 00:06:39.855423 kubelet[2536]: E0128 00:06:39.855390 2536 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.1.105:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-n-ea467cc685?timeout=10s\": dial tcp 10.0.1.105:6443: connect: connection refused" interval="200ms" Jan 28 00:06:39.860000 audit[2564]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2564 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:39.860000 audit[2564]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=ffffd3b38160 a2=0 a3=0 items=0 ppid=2536 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:39.860000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 28 00:06:39.863008 kubelet[2536]: I0128 00:06:39.862976 2536 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jan 28 00:06:39.863621 kubelet[2536]: I0128 00:06:39.863548 2536 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 28 00:06:39.863908 kubelet[2536]: I0128 00:06:39.863755 2536 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 28 00:06:39.863908 kubelet[2536]: I0128 00:06:39.863778 2536 state_mem.go:36] "Initialized new in-memory state store" Jan 28 00:06:39.862000 audit[2567]: NETFILTER_CFG table=mangle:47 family=2 entries=1 op=nft_register_chain pid=2567 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:39.862000 audit[2567]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc5312350 a2=0 a3=0 items=0 ppid=2536 pid=2567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:39.862000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 28 00:06:39.862000 audit[2566]: NETFILTER_CFG table=mangle:48 family=10 entries=2 op=nft_register_chain pid=2566 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:39.862000 audit[2566]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffc4699f20 a2=0 a3=0 items=0 ppid=2536 pid=2566 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:39.862000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 28 00:06:39.864408 kubelet[2536]: I0128 00:06:39.864388 2536 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Jan 28 00:06:39.864481 kubelet[2536]: I0128 00:06:39.864471 2536 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 28 00:06:39.864551 kubelet[2536]: I0128 00:06:39.864539 2536 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 28 00:06:39.864603 kubelet[2536]: I0128 00:06:39.864595 2536 kubelet.go:2436] "Starting kubelet main sync loop" Jan 28 00:06:39.864716 kubelet[2536]: E0128 00:06:39.864698 2536 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 28 00:06:39.863000 audit[2568]: NETFILTER_CFG table=nat:49 family=2 entries=1 op=nft_register_chain pid=2568 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:39.863000 audit[2568]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe39015a0 a2=0 a3=0 items=0 ppid=2536 pid=2568 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:39.863000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 28 00:06:39.864000 audit[2569]: NETFILTER_CFG table=mangle:50 family=10 entries=1 op=nft_register_chain pid=2569 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:39.864000 audit[2569]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffdef1b320 a2=0 a3=0 items=0 ppid=2536 pid=2569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:39.864000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 28 00:06:39.865896 kubelet[2536]: E0128 00:06:39.865764 2536 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.1.105:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.1.105:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 28 00:06:39.865000 audit[2570]: NETFILTER_CFG table=filter:51 family=2 entries=1 op=nft_register_chain pid=2570 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:39.865000 audit[2570]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffef1d6b70 a2=0 a3=0 items=0 ppid=2536 pid=2570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:39.865000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 28 00:06:39.865000 audit[2571]: NETFILTER_CFG table=nat:52 family=10 entries=1 op=nft_register_chain pid=2571 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:39.865000 audit[2571]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcc7c0280 a2=0 a3=0 items=0 ppid=2536 pid=2571 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:39.865000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 28 00:06:39.866000 audit[2572]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2572 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:39.866000 audit[2572]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc59e7320 a2=0 a3=0 items=0 ppid=2536 pid=2572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:39.866000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 28 00:06:39.868374 kubelet[2536]: I0128 00:06:39.868055 2536 policy_none.go:49] "None policy: Start" Jan 28 00:06:39.868374 kubelet[2536]: I0128 00:06:39.868074 2536 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 28 00:06:39.868374 kubelet[2536]: I0128 00:06:39.868083 2536 state_mem.go:35] "Initializing new in-memory state store" Jan 28 00:06:39.873684 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 28 00:06:39.894283 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 28 00:06:39.898402 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 28 00:06:39.909615 kubelet[2536]: E0128 00:06:39.909537 2536 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 28 00:06:39.909782 kubelet[2536]: I0128 00:06:39.909757 2536 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 28 00:06:39.909834 kubelet[2536]: I0128 00:06:39.909780 2536 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 28 00:06:39.910299 kubelet[2536]: I0128 00:06:39.910269 2536 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 28 00:06:39.912059 kubelet[2536]: E0128 00:06:39.911969 2536 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 28 00:06:39.912111 kubelet[2536]: E0128 00:06:39.912082 2536 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4593-0-0-n-ea467cc685\" not found" Jan 28 00:06:39.975411 systemd[1]: Created slice kubepods-burstable-pod2754debdbff30c1fe5fc954f9f9154cc.slice - libcontainer container kubepods-burstable-pod2754debdbff30c1fe5fc954f9f9154cc.slice. Jan 28 00:06:39.995776 kubelet[2536]: E0128 00:06:39.995752 2536 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-ea467cc685\" not found" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:39.999057 systemd[1]: Created slice kubepods-burstable-pod3863bc8f487de77322695b91903e5bcc.slice - libcontainer container kubepods-burstable-pod3863bc8f487de77322695b91903e5bcc.slice. 
Jan 28 00:06:40.000448 kubelet[2536]: E0128 00:06:40.000429 2536 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-ea467cc685\" not found" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.001723 systemd[1]: Created slice kubepods-burstable-pod6f8682244d88c049ee06bd0eb079b290.slice - libcontainer container kubepods-burstable-pod6f8682244d88c049ee06bd0eb079b290.slice. Jan 28 00:06:40.003032 kubelet[2536]: E0128 00:06:40.003014 2536 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-ea467cc685\" not found" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.011765 kubelet[2536]: I0128 00:06:40.011680 2536 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.012784 kubelet[2536]: E0128 00:06:40.012754 2536 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.1.105:6443/api/v1/nodes\": dial tcp 10.0.1.105:6443: connect: connection refused" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.056787 kubelet[2536]: E0128 00:06:40.056677 2536 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.1.105:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-n-ea467cc685?timeout=10s\": dial tcp 10.0.1.105:6443: connect: connection refused" interval="400ms" Jan 28 00:06:40.146465 kubelet[2536]: I0128 00:06:40.146358 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3863bc8f487de77322695b91903e5bcc-ca-certs\") pod \"kube-controller-manager-ci-4593-0-0-n-ea467cc685\" (UID: \"3863bc8f487de77322695b91903e5bcc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.146465 kubelet[2536]: I0128 00:06:40.146447 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3863bc8f487de77322695b91903e5bcc-k8s-certs\") pod \"kube-controller-manager-ci-4593-0-0-n-ea467cc685\" (UID: \"3863bc8f487de77322695b91903e5bcc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.146688 kubelet[2536]: I0128 00:06:40.146504 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3863bc8f487de77322695b91903e5bcc-kubeconfig\") pod \"kube-controller-manager-ci-4593-0-0-n-ea467cc685\" (UID: \"3863bc8f487de77322695b91903e5bcc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.146688 kubelet[2536]: I0128 00:06:40.146554 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3863bc8f487de77322695b91903e5bcc-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4593-0-0-n-ea467cc685\" (UID: \"3863bc8f487de77322695b91903e5bcc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.146688 kubelet[2536]: I0128 00:06:40.146584 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6f8682244d88c049ee06bd0eb079b290-kubeconfig\") pod \"kube-scheduler-ci-4593-0-0-n-ea467cc685\" (UID: \"6f8682244d88c049ee06bd0eb079b290\") " 
pod="kube-system/kube-scheduler-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.146688 kubelet[2536]: I0128 00:06:40.146597 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2754debdbff30c1fe5fc954f9f9154cc-ca-certs\") pod \"kube-apiserver-ci-4593-0-0-n-ea467cc685\" (UID: \"2754debdbff30c1fe5fc954f9f9154cc\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.146688 kubelet[2536]: I0128 00:06:40.146611 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2754debdbff30c1fe5fc954f9f9154cc-k8s-certs\") pod \"kube-apiserver-ci-4593-0-0-n-ea467cc685\" (UID: \"2754debdbff30c1fe5fc954f9f9154cc\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.146914 kubelet[2536]: I0128 00:06:40.146626 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2754debdbff30c1fe5fc954f9f9154cc-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4593-0-0-n-ea467cc685\" (UID: \"2754debdbff30c1fe5fc954f9f9154cc\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.146914 kubelet[2536]: I0128 00:06:40.146640 2536 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3863bc8f487de77322695b91903e5bcc-flexvolume-dir\") pod \"kube-controller-manager-ci-4593-0-0-n-ea467cc685\" (UID: \"3863bc8f487de77322695b91903e5bcc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.215079 kubelet[2536]: I0128 00:06:40.215043 2536 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.215382 kubelet[2536]: E0128 00:06:40.215346 2536 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.1.105:6443/api/v1/nodes\": dial tcp 10.0.1.105:6443: connect: connection refused" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.297924 containerd[1660]: time="2026-01-28T00:06:40.297831386Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4593-0-0-n-ea467cc685,Uid:2754debdbff30c1fe5fc954f9f9154cc,Namespace:kube-system,Attempt:0,}" Jan 28 00:06:40.301113 containerd[1660]: time="2026-01-28T00:06:40.300984035Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4593-0-0-n-ea467cc685,Uid:3863bc8f487de77322695b91903e5bcc,Namespace:kube-system,Attempt:0,}" Jan 28 00:06:40.304083 containerd[1660]: time="2026-01-28T00:06:40.304057965Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4593-0-0-n-ea467cc685,Uid:6f8682244d88c049ee06bd0eb079b290,Namespace:kube-system,Attempt:0,}" Jan 28 00:06:40.338292 containerd[1660]: time="2026-01-28T00:06:40.338242428Z" level=info msg="connecting to shim 79fba41990da084ef21f92f50171b561717594b6c8b2955ccbc0145ea12dc7c0" address="unix:///run/containerd/s/138613617339702a32d9deb6616d3fcc32c893bd7c3942ffadd158fa7d13cc48" namespace=k8s.io protocol=ttrpc version=3 Jan 28 00:06:40.347523 containerd[1660]: time="2026-01-28T00:06:40.347403056Z" level=info msg="connecting to shim 1de56ca7c1e5ecf65dda0dd5d85b05a2a753953b9bee0e08388d25c3bd852ac8" address="unix:///run/containerd/s/9dccda6cf65513b940c6630d7e393b40909e3da0bd64847a995e591f9552437b" 
namespace=k8s.io protocol=ttrpc version=3 Jan 28 00:06:40.364408 containerd[1660]: time="2026-01-28T00:06:40.364367188Z" level=info msg="connecting to shim 9bf72fee7415b41755d40f1116e16459e124be1cfb644fb896e611c9271901d9" address="unix:///run/containerd/s/91677684d8a5e4b17a43df77cdff6526bc5ce26939c58e75069a59796038a6e0" namespace=k8s.io protocol=ttrpc version=3 Jan 28 00:06:40.369464 systemd[1]: Started cri-containerd-79fba41990da084ef21f92f50171b561717594b6c8b2955ccbc0145ea12dc7c0.scope - libcontainer container 79fba41990da084ef21f92f50171b561717594b6c8b2955ccbc0145ea12dc7c0. Jan 28 00:06:40.372436 systemd[1]: Started cri-containerd-1de56ca7c1e5ecf65dda0dd5d85b05a2a753953b9bee0e08388d25c3bd852ac8.scope - libcontainer container 1de56ca7c1e5ecf65dda0dd5d85b05a2a753953b9bee0e08388d25c3bd852ac8. Jan 28 00:06:40.381000 audit: BPF prog-id=83 op=LOAD Jan 28 00:06:40.381000 audit: BPF prog-id=84 op=LOAD Jan 28 00:06:40.381000 audit[2602]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2582 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.381000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739666261343139393064613038346566323166393266353031373162 Jan 28 00:06:40.382000 audit: BPF prog-id=84 op=UNLOAD Jan 28 00:06:40.382000 audit[2602]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.382000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739666261343139393064613038346566323166393266353031373162 Jan 28 00:06:40.382000 audit: BPF prog-id=85 op=LOAD Jan 28 00:06:40.382000 audit[2602]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2582 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.382000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739666261343139393064613038346566323166393266353031373162 Jan 28 00:06:40.382000 audit: BPF prog-id=86 op=LOAD Jan 28 00:06:40.382000 audit[2602]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2582 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.382000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739666261343139393064613038346566323166393266353031373162 Jan 28 00:06:40.382000 audit: BPF prog-id=86 op=UNLOAD Jan 28 00:06:40.382000 audit[2602]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.382000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739666261343139393064613038346566323166393266353031373162 Jan 28 00:06:40.382000 audit: BPF prog-id=85 op=UNLOAD Jan 28 00:06:40.382000 audit[2602]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.382000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739666261343139393064613038346566323166393266353031373162 Jan 28 00:06:40.382000 audit: BPF prog-id=87 op=LOAD Jan 28 00:06:40.382000 audit[2602]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2582 pid=2602 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.382000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739666261343139393064613038346566323166393266353031373162 Jan 28 00:06:40.385000 audit: BPF prog-id=88 op=LOAD Jan 28 00:06:40.385000 audit: BPF prog-id=89 op=LOAD Jan 28 00:06:40.385000 audit[2618]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2600 pid=2618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.385000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164653536636137633165356563663635646461306464356438356230 Jan 28 00:06:40.385000 audit: BPF prog-id=89 op=UNLOAD Jan 28 00:06:40.385000 audit[2618]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=2618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.385000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164653536636137633165356563663635646461306464356438356230 Jan 28 00:06:40.385000 audit: BPF prog-id=90 op=LOAD Jan 28 00:06:40.385000 audit[2618]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2600 pid=2618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.385000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164653536636137633165356563663635646461306464356438356230 Jan 28 00:06:40.385000 audit: BPF prog-id=91 op=LOAD Jan 28 00:06:40.385000 audit[2618]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2600 pid=2618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.385000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164653536636137633165356563663635646461306464356438356230 Jan 28 00:06:40.385000 audit: BPF prog-id=91 op=UNLOAD Jan 28 00:06:40.385000 audit[2618]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=2618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.385000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164653536636137633165356563663635646461306464356438356230 Jan 28 00:06:40.385000 audit: BPF prog-id=90 op=UNLOAD Jan 28 00:06:40.385000 audit[2618]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=2618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.385000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164653536636137633165356563663635646461306464356438356230 Jan 28 00:06:40.385000 audit: BPF prog-id=92 op=LOAD Jan 28 00:06:40.385000 audit[2618]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2600 pid=2618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.385000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164653536636137633165356563663635646461306464356438356230 Jan 28 00:06:40.397652 systemd[1]: Started cri-containerd-9bf72fee7415b41755d40f1116e16459e124be1cfb644fb896e611c9271901d9.scope - libcontainer container 9bf72fee7415b41755d40f1116e16459e124be1cfb644fb896e611c9271901d9. Jan 28 00:06:40.411664 containerd[1660]: time="2026-01-28T00:06:40.411606571Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4593-0-0-n-ea467cc685,Uid:2754debdbff30c1fe5fc954f9f9154cc,Namespace:kube-system,Attempt:0,} returns sandbox id \"1de56ca7c1e5ecf65dda0dd5d85b05a2a753953b9bee0e08388d25c3bd852ac8\"" Jan 28 00:06:40.413000 audit: BPF prog-id=93 op=LOAD Jan 28 00:06:40.414000 audit: BPF prog-id=94 op=LOAD Jan 28 00:06:40.414000 audit[2659]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2645 pid=2659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.414000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962663732666565373431356234313735356434306631313136653136 Jan 28 00:06:40.414000 audit: BPF prog-id=94 op=UNLOAD Jan 28 00:06:40.414000 audit[2659]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.414000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962663732666565373431356234313735356434306631313136653136 Jan 28 00:06:40.414000 audit: BPF prog-id=95 op=LOAD Jan 28 00:06:40.414000 audit[2659]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2645 pid=2659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.414000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962663732666565373431356234313735356434306631313136653136 Jan 28 00:06:40.414000 audit: BPF prog-id=96 op=LOAD Jan 28 00:06:40.414000 audit[2659]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2645 pid=2659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.414000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962663732666565373431356234313735356434306631313136653136 Jan 28 00:06:40.414000 audit: BPF prog-id=96 op=UNLOAD Jan 28 00:06:40.414000 audit[2659]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.414000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962663732666565373431356234313735356434306631313136653136 Jan 28 00:06:40.414000 audit: BPF prog-id=95 op=UNLOAD Jan 28 00:06:40.414000 audit[2659]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.414000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962663732666565373431356234313735356434306631313136653136 Jan 28 00:06:40.414000 audit: BPF prog-id=97 op=LOAD Jan 28 00:06:40.414000 audit[2659]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2645 pid=2659 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.414000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962663732666565373431356234313735356434306631313136653136 Jan 28 00:06:40.418436 containerd[1660]: time="2026-01-28T00:06:40.418405832Z" level=info msg="CreateContainer within sandbox \"1de56ca7c1e5ecf65dda0dd5d85b05a2a753953b9bee0e08388d25c3bd852ac8\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 28 00:06:40.425533 containerd[1660]: time="2026-01-28T00:06:40.425492173Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4593-0-0-n-ea467cc685,Uid:3863bc8f487de77322695b91903e5bcc,Namespace:kube-system,Attempt:0,} returns sandbox id \"79fba41990da084ef21f92f50171b561717594b6c8b2955ccbc0145ea12dc7c0\"" Jan 28 00:06:40.433045 containerd[1660]: time="2026-01-28T00:06:40.433012916Z" level=info msg="CreateContainer within sandbox \"79fba41990da084ef21f92f50171b561717594b6c8b2955ccbc0145ea12dc7c0\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 28 00:06:40.435153 containerd[1660]: time="2026-01-28T00:06:40.435095963Z" level=info msg="Container 618c7a88dad6cfe173593b11eec5b861a7be5db81b82aa2e52f713d140a21b5e: CDI devices from CRI Config.CDIDevices: []" Jan 28 00:06:40.445605 containerd[1660]: time="2026-01-28T00:06:40.445562394Z" level=info msg="CreateContainer within sandbox 
\"1de56ca7c1e5ecf65dda0dd5d85b05a2a753953b9bee0e08388d25c3bd852ac8\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"618c7a88dad6cfe173593b11eec5b861a7be5db81b82aa2e52f713d140a21b5e\"" Jan 28 00:06:40.446689 containerd[1660]: time="2026-01-28T00:06:40.446660158Z" level=info msg="StartContainer for \"618c7a88dad6cfe173593b11eec5b861a7be5db81b82aa2e52f713d140a21b5e\"" Jan 28 00:06:40.447936 containerd[1660]: time="2026-01-28T00:06:40.447852481Z" level=info msg="connecting to shim 618c7a88dad6cfe173593b11eec5b861a7be5db81b82aa2e52f713d140a21b5e" address="unix:///run/containerd/s/9dccda6cf65513b940c6630d7e393b40909e3da0bd64847a995e591f9552437b" protocol=ttrpc version=3 Jan 28 00:06:40.449244 containerd[1660]: time="2026-01-28T00:06:40.449219925Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4593-0-0-n-ea467cc685,Uid:6f8682244d88c049ee06bd0eb079b290,Namespace:kube-system,Attempt:0,} returns sandbox id \"9bf72fee7415b41755d40f1116e16459e124be1cfb644fb896e611c9271901d9\"" Jan 28 00:06:40.453624 containerd[1660]: time="2026-01-28T00:06:40.453591099Z" level=info msg="CreateContainer within sandbox \"9bf72fee7415b41755d40f1116e16459e124be1cfb644fb896e611c9271901d9\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 28 00:06:40.456265 containerd[1660]: time="2026-01-28T00:06:40.455390744Z" level=info msg="Container 87d87ae4e516d477ff4465663aed43c7516fb7ff4c5a78513ce6937f1ebe6ae7: CDI devices from CRI Config.CDIDevices: []" Jan 28 00:06:40.458133 kubelet[2536]: E0128 00:06:40.458090 2536 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.1.105:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4593-0-0-n-ea467cc685?timeout=10s\": dial tcp 10.0.1.105:6443: connect: connection refused" interval="800ms" Jan 28 00:06:40.466396 containerd[1660]: time="2026-01-28T00:06:40.466365337Z" level=info msg="CreateContainer within sandbox \"79fba41990da084ef21f92f50171b561717594b6c8b2955ccbc0145ea12dc7c0\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"87d87ae4e516d477ff4465663aed43c7516fb7ff4c5a78513ce6937f1ebe6ae7\"" Jan 28 00:06:40.466757 containerd[1660]: time="2026-01-28T00:06:40.466735419Z" level=info msg="StartContainer for \"87d87ae4e516d477ff4465663aed43c7516fb7ff4c5a78513ce6937f1ebe6ae7\"" Jan 28 00:06:40.467369 systemd[1]: Started cri-containerd-618c7a88dad6cfe173593b11eec5b861a7be5db81b82aa2e52f713d140a21b5e.scope - libcontainer container 618c7a88dad6cfe173593b11eec5b861a7be5db81b82aa2e52f713d140a21b5e. 
Jan 28 00:06:40.467936 containerd[1660]: time="2026-01-28T00:06:40.467714782Z" level=info msg="connecting to shim 87d87ae4e516d477ff4465663aed43c7516fb7ff4c5a78513ce6937f1ebe6ae7" address="unix:///run/containerd/s/138613617339702a32d9deb6616d3fcc32c893bd7c3942ffadd158fa7d13cc48" protocol=ttrpc version=3 Jan 28 00:06:40.475520 containerd[1660]: time="2026-01-28T00:06:40.475476725Z" level=info msg="Container 301912f0d665aeef89e8075edd528c6623895d2c5a8e6d210a27f55d8c23bb61: CDI devices from CRI Config.CDIDevices: []" Jan 28 00:06:40.479000 audit: BPF prog-id=98 op=LOAD Jan 28 00:06:40.479000 audit: BPF prog-id=99 op=LOAD Jan 28 00:06:40.479000 audit[2709]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138180 a2=98 a3=0 items=0 ppid=2600 pid=2709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.479000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631386337613838646164366366653137333539336231316565633562 Jan 28 00:06:40.480000 audit: BPF prog-id=99 op=UNLOAD Jan 28 00:06:40.480000 audit[2709]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=2709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.480000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631386337613838646164366366653137333539336231316565633562 Jan 28 00:06:40.480000 audit: BPF prog-id=100 op=LOAD Jan 28 00:06:40.480000 audit[2709]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001383e8 a2=98 a3=0 items=0 ppid=2600 pid=2709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.480000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631386337613838646164366366653137333539336231316565633562 Jan 28 00:06:40.481000 audit: BPF prog-id=101 op=LOAD Jan 28 00:06:40.481000 audit[2709]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000138168 a2=98 a3=0 items=0 ppid=2600 pid=2709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631386337613838646164366366653137333539336231316565633562 Jan 28 00:06:40.481000 audit: BPF prog-id=101 op=UNLOAD Jan 28 00:06:40.481000 audit[2709]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=2709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631386337613838646164366366653137333539336231316565633562 Jan 28 00:06:40.481000 audit: BPF prog-id=100 op=UNLOAD Jan 28 00:06:40.481000 audit[2709]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2600 pid=2709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631386337613838646164366366653137333539336231316565633562 Jan 28 00:06:40.481000 audit: BPF prog-id=102 op=LOAD Jan 28 00:06:40.481000 audit[2709]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138648 a2=98 a3=0 items=0 ppid=2600 pid=2709 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3631386337613838646164366366653137333539336231316565633562 Jan 28 00:06:40.485734 containerd[1660]: time="2026-01-28T00:06:40.485695836Z" level=info msg="CreateContainer within sandbox \"9bf72fee7415b41755d40f1116e16459e124be1cfb644fb896e611c9271901d9\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"301912f0d665aeef89e8075edd528c6623895d2c5a8e6d210a27f55d8c23bb61\"" Jan 28 00:06:40.486279 containerd[1660]: time="2026-01-28T00:06:40.486254078Z" level=info msg="StartContainer for \"301912f0d665aeef89e8075edd528c6623895d2c5a8e6d210a27f55d8c23bb61\"" Jan 28 00:06:40.487371 containerd[1660]: time="2026-01-28T00:06:40.487338281Z" level=info msg="connecting to shim 301912f0d665aeef89e8075edd528c6623895d2c5a8e6d210a27f55d8c23bb61" address="unix:///run/containerd/s/91677684d8a5e4b17a43df77cdff6526bc5ce26939c58e75069a59796038a6e0" protocol=ttrpc version=3 Jan 28 00:06:40.489396 systemd[1]: Started cri-containerd-87d87ae4e516d477ff4465663aed43c7516fb7ff4c5a78513ce6937f1ebe6ae7.scope - libcontainer container 87d87ae4e516d477ff4465663aed43c7516fb7ff4c5a78513ce6937f1ebe6ae7. Jan 28 00:06:40.507397 systemd[1]: Started cri-containerd-301912f0d665aeef89e8075edd528c6623895d2c5a8e6d210a27f55d8c23bb61.scope - libcontainer container 301912f0d665aeef89e8075edd528c6623895d2c5a8e6d210a27f55d8c23bb61. 
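Each "connecting to shim" record above pairs a 64-hex-character sandbox or container id with the ttrpc socket of its containerd shim; the kube-apiserver container 618c7a88… reuses the socket of its sandbox 1de56ca7…, as the addresses show. A small Python sketch (the regex and helper are ours, not containerd's) for extracting those pairs from log lines like these:

```python
import re

# Illustrative: extract (id, shim socket) pairs from containerd "connecting to shim" records.
SHIM_RE = re.compile(
    r'msg="connecting to shim (?P<id>[0-9a-f]{64})" address="(?P<addr>unix://[^"]+)"'
)

def shim_connections(lines):
    for line in lines:
        m = SHIM_RE.search(line)
        if m:
            yield m.group("id"), m.group("addr")

record = ('time="2026-01-28T00:06:40.447852481Z" level=info '
          'msg="connecting to shim 618c7a88dad6cfe173593b11eec5b861a7be5db81b82aa2e52f713d140a21b5e" '
          'address="unix:///run/containerd/s/9dccda6cf65513b940c6630d7e393b40909e3da0bd64847a995e591f9552437b"')
for cid, addr in shim_connections([record]):
    print(cid[:12], "->", addr)
# 618c7a88dad6 -> unix:///run/containerd/s/9dccda6cf65513b940c6630d7e393b40909e3da0bd64847a995e591f9552437b
```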
Jan 28 00:06:40.509000 audit: BPF prog-id=103 op=LOAD Jan 28 00:06:40.509000 audit: BPF prog-id=104 op=LOAD Jan 28 00:06:40.509000 audit[2725]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2582 pid=2725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.509000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3837643837616534653531366434373766663434363536363361656434 Jan 28 00:06:40.510000 audit: BPF prog-id=104 op=UNLOAD Jan 28 00:06:40.510000 audit[2725]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.510000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3837643837616534653531366434373766663434363536363361656434 Jan 28 00:06:40.510000 audit: BPF prog-id=105 op=LOAD Jan 28 00:06:40.510000 audit[2725]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2582 pid=2725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.510000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3837643837616534653531366434373766663434363536363361656434 Jan 28 00:06:40.510000 audit: BPF prog-id=106 op=LOAD Jan 28 00:06:40.510000 audit[2725]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2582 pid=2725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.510000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3837643837616534653531366434373766663434363536363361656434 Jan 28 00:06:40.511000 audit: BPF prog-id=106 op=UNLOAD Jan 28 00:06:40.511000 audit[2725]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.511000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3837643837616534653531366434373766663434363536363361656434 Jan 28 00:06:40.511000 audit: BPF prog-id=105 op=UNLOAD Jan 28 00:06:40.511000 audit[2725]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2582 pid=2725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.511000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3837643837616534653531366434373766663434363536363361656434 Jan 28 00:06:40.512000 audit: BPF prog-id=107 op=LOAD Jan 28 00:06:40.512000 audit[2725]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2582 pid=2725 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.512000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3837643837616534653531366434373766663434363536363361656434 Jan 28 00:06:40.517524 containerd[1660]: time="2026-01-28T00:06:40.517447733Z" level=info msg="StartContainer for \"618c7a88dad6cfe173593b11eec5b861a7be5db81b82aa2e52f713d140a21b5e\" returns successfully" Jan 28 00:06:40.523000 audit: BPF prog-id=108 op=LOAD Jan 28 00:06:40.524000 audit: BPF prog-id=109 op=LOAD Jan 28 00:06:40.524000 audit[2743]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2645 pid=2743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330313931326630643636356165656638396538303735656464353238 Jan 28 00:06:40.524000 audit: BPF prog-id=109 op=UNLOAD Jan 28 00:06:40.524000 audit[2743]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330313931326630643636356165656638396538303735656464353238 Jan 28 00:06:40.524000 audit: BPF prog-id=110 op=LOAD Jan 28 00:06:40.524000 audit[2743]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2645 pid=2743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.524000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330313931326630643636356165656638396538303735656464353238 Jan 28 00:06:40.524000 audit: BPF prog-id=111 op=LOAD Jan 28 00:06:40.524000 audit[2743]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2645 pid=2743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330313931326630643636356165656638396538303735656464353238 Jan 28 00:06:40.524000 audit: BPF prog-id=111 op=UNLOAD Jan 28 00:06:40.524000 audit[2743]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330313931326630643636356165656638396538303735656464353238 Jan 28 00:06:40.524000 audit: BPF prog-id=110 op=UNLOAD Jan 28 00:06:40.524000 audit[2743]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2645 pid=2743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330313931326630643636356165656638396538303735656464353238 Jan 28 00:06:40.524000 audit: BPF prog-id=112 op=LOAD Jan 28 00:06:40.524000 audit[2743]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2645 pid=2743 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:40.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330313931326630643636356165656638396538303735656464353238 Jan 28 00:06:40.551620 containerd[1660]: time="2026-01-28T00:06:40.551419996Z" level=info msg="StartContainer for \"87d87ae4e516d477ff4465663aed43c7516fb7ff4c5a78513ce6937f1ebe6ae7\" returns successfully" Jan 28 00:06:40.555294 containerd[1660]: time="2026-01-28T00:06:40.555259487Z" level=info msg="StartContainer for \"301912f0d665aeef89e8075edd528c6623895d2c5a8e6d210a27f55d8c23bb61\" returns successfully" Jan 28 00:06:40.618327 kubelet[2536]: I0128 00:06:40.618282 2536 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-n-ea467cc685" Jan 
28 00:06:40.871035 kubelet[2536]: E0128 00:06:40.870998 2536 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-ea467cc685\" not found" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.875259 kubelet[2536]: E0128 00:06:40.874777 2536 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-ea467cc685\" not found" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:40.876054 kubelet[2536]: E0128 00:06:40.876034 2536 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-ea467cc685\" not found" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:41.878934 kubelet[2536]: E0128 00:06:41.878750 2536 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-ea467cc685\" not found" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:41.880560 kubelet[2536]: E0128 00:06:41.880369 2536 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4593-0-0-n-ea467cc685\" not found" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:42.104166 kubelet[2536]: I0128 00:06:42.104018 2536 kubelet_node_status.go:78] "Successfully registered node" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:42.104166 kubelet[2536]: E0128 00:06:42.104053 2536 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4593-0-0-n-ea467cc685\": node \"ci-4593-0-0-n-ea467cc685\" not found" Jan 28 00:06:42.151288 kubelet[2536]: I0128 00:06:42.151065 2536 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:42.160603 kubelet[2536]: E0128 00:06:42.160402 2536 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4593-0-0-n-ea467cc685\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:42.160603 kubelet[2536]: I0128 00:06:42.160431 2536 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:42.162460 kubelet[2536]: E0128 00:06:42.162438 2536 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4593-0-0-n-ea467cc685\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:42.162635 kubelet[2536]: I0128 00:06:42.162519 2536 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:42.164022 kubelet[2536]: E0128 00:06:42.163998 2536 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4593-0-0-n-ea467cc685\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:42.835233 kubelet[2536]: I0128 00:06:42.834987 2536 apiserver.go:52] "Watching apiserver" Jan 28 00:06:42.845876 kubelet[2536]: I0128 00:06:42.845840 2536 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 28 00:06:42.878101 kubelet[2536]: I0128 00:06:42.877877 2536 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:42.880545 kubelet[2536]: E0128 
00:06:42.880516 2536 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4593-0-0-n-ea467cc685\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:43.380135 kubelet[2536]: I0128 00:06:43.380104 2536 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:44.366243 systemd[1]: Reload requested from client PID 2820 ('systemctl') (unit session-8.scope)... Jan 28 00:06:44.366257 systemd[1]: Reloading... Jan 28 00:06:44.432365 zram_generator::config[2869]: No configuration found. Jan 28 00:06:44.607319 systemd[1]: Reloading finished in 240 ms. Jan 28 00:06:44.630105 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 00:06:44.647102 systemd[1]: kubelet.service: Deactivated successfully. Jan 28 00:06:44.647389 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 00:06:44.646000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:44.650455 kernel: kauditd_printk_skb: 203 callbacks suppressed Jan 28 00:06:44.650513 kernel: audit: type=1131 audit(1769558804.646:397): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:44.650517 systemd[1]: kubelet.service: Consumed 1.277s CPU time, 128.9M memory peak. Jan 28 00:06:44.652285 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 28 00:06:44.652000 audit: BPF prog-id=113 op=LOAD Jan 28 00:06:44.652000 audit: BPF prog-id=75 op=UNLOAD Jan 28 00:06:44.655873 kernel: audit: type=1334 audit(1769558804.652:398): prog-id=113 op=LOAD Jan 28 00:06:44.655953 kernel: audit: type=1334 audit(1769558804.652:399): prog-id=75 op=UNLOAD Jan 28 00:06:44.657244 kernel: audit: type=1334 audit(1769558804.654:400): prog-id=114 op=LOAD Jan 28 00:06:44.657313 kernel: audit: type=1334 audit(1769558804.654:401): prog-id=71 op=UNLOAD Jan 28 00:06:44.654000 audit: BPF prog-id=114 op=LOAD Jan 28 00:06:44.654000 audit: BPF prog-id=71 op=UNLOAD Jan 28 00:06:44.657603 kernel: audit: type=1334 audit(1769558804.655:402): prog-id=115 op=LOAD Jan 28 00:06:44.655000 audit: BPF prog-id=115 op=LOAD Jan 28 00:06:44.680000 audit: BPF prog-id=72 op=UNLOAD Jan 28 00:06:44.680000 audit: BPF prog-id=116 op=LOAD Jan 28 00:06:44.683272 kernel: audit: type=1334 audit(1769558804.680:403): prog-id=72 op=UNLOAD Jan 28 00:06:44.683321 kernel: audit: type=1334 audit(1769558804.680:404): prog-id=116 op=LOAD Jan 28 00:06:44.683339 kernel: audit: type=1334 audit(1769558804.681:405): prog-id=117 op=LOAD Jan 28 00:06:44.681000 audit: BPF prog-id=117 op=LOAD Jan 28 00:06:44.681000 audit: BPF prog-id=73 op=UNLOAD Jan 28 00:06:44.684869 kernel: audit: type=1334 audit(1769558804.681:406): prog-id=73 op=UNLOAD Jan 28 00:06:44.681000 audit: BPF prog-id=74 op=UNLOAD Jan 28 00:06:44.682000 audit: BPF prog-id=118 op=LOAD Jan 28 00:06:44.682000 audit: BPF prog-id=77 op=UNLOAD Jan 28 00:06:44.682000 audit: BPF prog-id=119 op=LOAD Jan 28 00:06:44.683000 audit: BPF prog-id=120 op=LOAD Jan 28 00:06:44.683000 audit: BPF prog-id=78 op=UNLOAD Jan 28 00:06:44.683000 audit: BPF prog-id=79 op=UNLOAD Jan 28 00:06:44.684000 audit: BPF prog-id=121 
op=LOAD Jan 28 00:06:44.684000 audit: BPF prog-id=122 op=LOAD Jan 28 00:06:44.684000 audit: BPF prog-id=69 op=UNLOAD Jan 28 00:06:44.684000 audit: BPF prog-id=70 op=UNLOAD Jan 28 00:06:44.685000 audit: BPF prog-id=123 op=LOAD Jan 28 00:06:44.685000 audit: BPF prog-id=63 op=UNLOAD Jan 28 00:06:44.685000 audit: BPF prog-id=124 op=LOAD Jan 28 00:06:44.685000 audit: BPF prog-id=125 op=LOAD Jan 28 00:06:44.685000 audit: BPF prog-id=64 op=UNLOAD Jan 28 00:06:44.685000 audit: BPF prog-id=65 op=UNLOAD Jan 28 00:06:44.686000 audit: BPF prog-id=126 op=LOAD Jan 28 00:06:44.686000 audit: BPF prog-id=66 op=UNLOAD Jan 28 00:06:44.686000 audit: BPF prog-id=127 op=LOAD Jan 28 00:06:44.686000 audit: BPF prog-id=128 op=LOAD Jan 28 00:06:44.686000 audit: BPF prog-id=67 op=UNLOAD Jan 28 00:06:44.686000 audit: BPF prog-id=68 op=UNLOAD Jan 28 00:06:44.687000 audit: BPF prog-id=129 op=LOAD Jan 28 00:06:44.687000 audit: BPF prog-id=80 op=UNLOAD Jan 28 00:06:44.687000 audit: BPF prog-id=130 op=LOAD Jan 28 00:06:44.687000 audit: BPF prog-id=131 op=LOAD Jan 28 00:06:44.687000 audit: BPF prog-id=81 op=UNLOAD Jan 28 00:06:44.687000 audit: BPF prog-id=82 op=UNLOAD Jan 28 00:06:44.688000 audit: BPF prog-id=132 op=LOAD Jan 28 00:06:44.688000 audit: BPF prog-id=76 op=UNLOAD Jan 28 00:06:44.833425 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 28 00:06:44.832000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:44.837580 (kubelet)[2911]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 28 00:06:44.868750 kubelet[2911]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 00:06:44.868750 kubelet[2911]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 28 00:06:44.868750 kubelet[2911]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
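The audit stream around the kubelet restart is dominated by BPF prog-id LOAD/UNLOAD pairs, which is consistent with systemd re-attaching per-unit cgroup BPF filters during the reload. A rough Python sketch (illustrative only) that tallies which program ids remain loaded after scanning records like the ones above:

```python
import re

# Illustrative: track BPF program ids from audit "prog-id=N op=LOAD/UNLOAD" records
# and report which ids are still loaded at the end of the scanned lines.
BPF_RE = re.compile(r"prog-id=(\d+) op=(LOAD|UNLOAD)")

def live_bpf_prog_ids(lines):
    live = set()
    for line in lines:
        for prog_id, op in BPF_RE.findall(line):
            if op == "LOAD":
                live.add(int(prog_id))
            else:
                live.discard(int(prog_id))
    return sorted(live)

sample = [
    "Jan 28 00:06:44.652000 audit: BPF prog-id=113 op=LOAD",
    "Jan 28 00:06:44.652000 audit: BPF prog-id=75 op=UNLOAD",
    "Jan 28 00:06:44.688000 audit: BPF prog-id=132 op=LOAD",
]
print(live_bpf_prog_ids(sample))   # [113, 132]
```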
Jan 28 00:06:44.869069 kubelet[2911]: I0128 00:06:44.868820 2911 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 28 00:06:44.876264 kubelet[2911]: I0128 00:06:44.876226 2911 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 28 00:06:44.876264 kubelet[2911]: I0128 00:06:44.876251 2911 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 28 00:06:44.876447 kubelet[2911]: I0128 00:06:44.876437 2911 server.go:956] "Client rotation is on, will bootstrap in background" Jan 28 00:06:44.877735 kubelet[2911]: I0128 00:06:44.877716 2911 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 28 00:06:44.879891 kubelet[2911]: I0128 00:06:44.879858 2911 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 28 00:06:44.884800 kubelet[2911]: I0128 00:06:44.884705 2911 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 28 00:06:44.888065 kubelet[2911]: I0128 00:06:44.888025 2911 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 28 00:06:44.888358 kubelet[2911]: I0128 00:06:44.888322 2911 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 28 00:06:44.888538 kubelet[2911]: I0128 00:06:44.888351 2911 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4593-0-0-n-ea467cc685","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 28 00:06:44.888641 kubelet[2911]: I0128 00:06:44.888548 2911 topology_manager.go:138] "Creating topology manager with none policy" Jan 28 00:06:44.888641 kubelet[2911]: I0128 00:06:44.888558 2911 container_manager_linux.go:303] "Creating device plugin manager" Jan 28 00:06:44.888641 kubelet[2911]: I0128 00:06:44.888617 2911 state_mem.go:36] "Initialized new in-memory state store" Jan 28 00:06:44.889716 kubelet[2911]: 
I0128 00:06:44.888826 2911 kubelet.go:480] "Attempting to sync node with API server" Jan 28 00:06:44.889716 kubelet[2911]: I0128 00:06:44.888854 2911 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 28 00:06:44.889716 kubelet[2911]: I0128 00:06:44.888879 2911 kubelet.go:386] "Adding apiserver pod source" Jan 28 00:06:44.889716 kubelet[2911]: I0128 00:06:44.888942 2911 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 28 00:06:44.890532 kubelet[2911]: I0128 00:06:44.890503 2911 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 28 00:06:44.891930 kubelet[2911]: I0128 00:06:44.891902 2911 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 28 00:06:44.898628 kubelet[2911]: I0128 00:06:44.898345 2911 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 28 00:06:44.898628 kubelet[2911]: I0128 00:06:44.898395 2911 server.go:1289] "Started kubelet" Jan 28 00:06:44.902485 kubelet[2911]: I0128 00:06:44.902460 2911 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 28 00:06:44.902980 kubelet[2911]: I0128 00:06:44.902935 2911 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 28 00:06:44.904110 kubelet[2911]: I0128 00:06:44.904089 2911 server.go:317] "Adding debug handlers to kubelet server" Jan 28 00:06:44.907003 kubelet[2911]: I0128 00:06:44.906958 2911 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 28 00:06:44.908844 kubelet[2911]: I0128 00:06:44.908812 2911 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 28 00:06:44.909103 kubelet[2911]: I0128 00:06:44.909082 2911 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 28 00:06:44.912483 kubelet[2911]: I0128 00:06:44.912455 2911 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 28 00:06:44.913252 kubelet[2911]: E0128 00:06:44.913225 2911 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4593-0-0-n-ea467cc685\" not found" Jan 28 00:06:44.913919 kubelet[2911]: I0128 00:06:44.913898 2911 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 28 00:06:44.914029 kubelet[2911]: I0128 00:06:44.914015 2911 reconciler.go:26] "Reconciler: start to sync state" Jan 28 00:06:44.915270 kubelet[2911]: E0128 00:06:44.915201 2911 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 28 00:06:44.915569 kubelet[2911]: I0128 00:06:44.915542 2911 factory.go:223] Registration of the systemd container factory successfully Jan 28 00:06:44.915679 kubelet[2911]: I0128 00:06:44.915655 2911 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 28 00:06:44.919001 kubelet[2911]: I0128 00:06:44.918974 2911 factory.go:223] Registration of the containerd container factory successfully Jan 28 00:06:44.924796 kubelet[2911]: I0128 00:06:44.924751 2911 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Jan 28 00:06:44.927483 kubelet[2911]: I0128 00:06:44.927455 2911 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Jan 28 00:06:44.927483 kubelet[2911]: I0128 00:06:44.927488 2911 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 28 00:06:44.927576 kubelet[2911]: I0128 00:06:44.927507 2911 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 28 00:06:44.927576 kubelet[2911]: I0128 00:06:44.927513 2911 kubelet.go:2436] "Starting kubelet main sync loop" Jan 28 00:06:44.927576 kubelet[2911]: E0128 00:06:44.927551 2911 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 28 00:06:44.952933 kubelet[2911]: I0128 00:06:44.952907 2911 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 28 00:06:44.952933 kubelet[2911]: I0128 00:06:44.952924 2911 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 28 00:06:44.952933 kubelet[2911]: I0128 00:06:44.952945 2911 state_mem.go:36] "Initialized new in-memory state store" Jan 28 00:06:44.953099 kubelet[2911]: I0128 00:06:44.953073 2911 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 28 00:06:44.953126 kubelet[2911]: I0128 00:06:44.953094 2911 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 28 00:06:44.953126 kubelet[2911]: I0128 00:06:44.953115 2911 policy_none.go:49] "None policy: Start" Jan 28 00:06:44.953126 kubelet[2911]: I0128 00:06:44.953123 2911 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 28 00:06:44.953191 kubelet[2911]: I0128 00:06:44.953131 2911 state_mem.go:35] "Initializing new in-memory state store" Jan 28 00:06:44.953246 kubelet[2911]: I0128 00:06:44.953234 2911 state_mem.go:75] "Updated machine memory state" Jan 28 00:06:44.957358 kubelet[2911]: E0128 00:06:44.957276 2911 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 28 00:06:44.957872 kubelet[2911]: I0128 00:06:44.957856 2911 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 28 00:06:44.957919 kubelet[2911]: I0128 00:06:44.957872 2911 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 28 00:06:44.958115 kubelet[2911]: I0128 00:06:44.958098 2911 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 28 00:06:44.961990 kubelet[2911]: E0128 00:06:44.961954 2911 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 28 00:06:45.028614 kubelet[2911]: I0128 00:06:45.028566 2911 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.028959 kubelet[2911]: I0128 00:06:45.028813 2911 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.030305 kubelet[2911]: I0128 00:06:45.029245 2911 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.037045 kubelet[2911]: E0128 00:06:45.037011 2911 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4593-0-0-n-ea467cc685\" already exists" pod="kube-system/kube-apiserver-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.060237 kubelet[2911]: I0128 00:06:45.060190 2911 kubelet_node_status.go:75] "Attempting to register node" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.067418 kubelet[2911]: I0128 00:06:45.067371 2911 kubelet_node_status.go:124] "Node was previously registered" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.067492 kubelet[2911]: I0128 00:06:45.067454 2911 kubelet_node_status.go:78] "Successfully registered node" node="ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.215576 kubelet[2911]: I0128 00:06:45.215426 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2754debdbff30c1fe5fc954f9f9154cc-ca-certs\") pod \"kube-apiserver-ci-4593-0-0-n-ea467cc685\" (UID: \"2754debdbff30c1fe5fc954f9f9154cc\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.215576 kubelet[2911]: I0128 00:06:45.215515 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2754debdbff30c1fe5fc954f9f9154cc-k8s-certs\") pod \"kube-apiserver-ci-4593-0-0-n-ea467cc685\" (UID: \"2754debdbff30c1fe5fc954f9f9154cc\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.215576 kubelet[2911]: I0128 00:06:45.215539 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3863bc8f487de77322695b91903e5bcc-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4593-0-0-n-ea467cc685\" (UID: \"3863bc8f487de77322695b91903e5bcc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.215721 kubelet[2911]: I0128 00:06:45.215580 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2754debdbff30c1fe5fc954f9f9154cc-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4593-0-0-n-ea467cc685\" (UID: \"2754debdbff30c1fe5fc954f9f9154cc\") " pod="kube-system/kube-apiserver-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.215721 kubelet[2911]: I0128 00:06:45.215599 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3863bc8f487de77322695b91903e5bcc-ca-certs\") pod \"kube-controller-manager-ci-4593-0-0-n-ea467cc685\" (UID: \"3863bc8f487de77322695b91903e5bcc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.215721 kubelet[2911]: I0128 00:06:45.215619 2911 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3863bc8f487de77322695b91903e5bcc-flexvolume-dir\") pod \"kube-controller-manager-ci-4593-0-0-n-ea467cc685\" (UID: \"3863bc8f487de77322695b91903e5bcc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.215721 kubelet[2911]: I0128 00:06:45.215656 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3863bc8f487de77322695b91903e5bcc-k8s-certs\") pod \"kube-controller-manager-ci-4593-0-0-n-ea467cc685\" (UID: \"3863bc8f487de77322695b91903e5bcc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.215721 kubelet[2911]: I0128 00:06:45.215692 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3863bc8f487de77322695b91903e5bcc-kubeconfig\") pod \"kube-controller-manager-ci-4593-0-0-n-ea467cc685\" (UID: \"3863bc8f487de77322695b91903e5bcc\") " pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.215820 kubelet[2911]: I0128 00:06:45.215714 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6f8682244d88c049ee06bd0eb079b290-kubeconfig\") pod \"kube-scheduler-ci-4593-0-0-n-ea467cc685\" (UID: \"6f8682244d88c049ee06bd0eb079b290\") " pod="kube-system/kube-scheduler-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.889459 kubelet[2911]: I0128 00:06:45.889306 2911 apiserver.go:52] "Watching apiserver" Jan 28 00:06:45.914164 kubelet[2911]: I0128 00:06:45.914109 2911 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 28 00:06:45.938258 kubelet[2911]: I0128 00:06:45.938116 2911 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.938258 kubelet[2911]: I0128 00:06:45.938225 2911 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.944473 kubelet[2911]: E0128 00:06:45.944441 2911 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4593-0-0-n-ea467cc685\" already exists" pod="kube-system/kube-apiserver-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.945200 kubelet[2911]: E0128 00:06:45.945169 2911 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4593-0-0-n-ea467cc685\" already exists" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" Jan 28 00:06:45.983553 kubelet[2911]: I0128 00:06:45.983487 2911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4593-0-0-n-ea467cc685" podStartSLOduration=0.983470975 podStartE2EDuration="983.470975ms" podCreationTimestamp="2026-01-28 00:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 00:06:45.969812013 +0000 UTC m=+1.128988230" watchObservedRunningTime="2026-01-28 00:06:45.983470975 +0000 UTC m=+1.142647232" Jan 28 00:06:45.991986 kubelet[2911]: I0128 00:06:45.991927 2911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4593-0-0-n-ea467cc685" podStartSLOduration=2.99191136 
podStartE2EDuration="2.99191136s" podCreationTimestamp="2026-01-28 00:06:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 00:06:45.983450695 +0000 UTC m=+1.142626952" watchObservedRunningTime="2026-01-28 00:06:45.99191136 +0000 UTC m=+1.151087617" Jan 28 00:06:50.604032 kubelet[2911]: I0128 00:06:50.603973 2911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4593-0-0-n-ea467cc685" podStartSLOduration=5.603958608 podStartE2EDuration="5.603958608s" podCreationTimestamp="2026-01-28 00:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 00:06:45.992110481 +0000 UTC m=+1.151286738" watchObservedRunningTime="2026-01-28 00:06:50.603958608 +0000 UTC m=+5.763134945" Jan 28 00:06:50.616216 kubelet[2911]: I0128 00:06:50.616173 2911 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 28 00:06:50.616517 containerd[1660]: time="2026-01-28T00:06:50.616476046Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 28 00:06:50.616890 kubelet[2911]: I0128 00:06:50.616617 2911 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 28 00:06:51.368592 systemd[1]: Created slice kubepods-besteffort-podf6eb6cfc_a347_465c_a0d6_f65d66f70310.slice - libcontainer container kubepods-besteffort-podf6eb6cfc_a347_465c_a0d6_f65d66f70310.slice. Jan 28 00:06:51.452832 kubelet[2911]: I0128 00:06:51.452708 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/f6eb6cfc-a347-465c-a0d6-f65d66f70310-kube-proxy\") pod \"kube-proxy-lsqnw\" (UID: \"f6eb6cfc-a347-465c-a0d6-f65d66f70310\") " pod="kube-system/kube-proxy-lsqnw" Jan 28 00:06:51.452832 kubelet[2911]: I0128 00:06:51.452833 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f6eb6cfc-a347-465c-a0d6-f65d66f70310-xtables-lock\") pod \"kube-proxy-lsqnw\" (UID: \"f6eb6cfc-a347-465c-a0d6-f65d66f70310\") " pod="kube-system/kube-proxy-lsqnw" Jan 28 00:06:51.452968 kubelet[2911]: I0128 00:06:51.452851 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f6eb6cfc-a347-465c-a0d6-f65d66f70310-lib-modules\") pod \"kube-proxy-lsqnw\" (UID: \"f6eb6cfc-a347-465c-a0d6-f65d66f70310\") " pod="kube-system/kube-proxy-lsqnw" Jan 28 00:06:51.452968 kubelet[2911]: I0128 00:06:51.452902 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c9mn\" (UniqueName: \"kubernetes.io/projected/f6eb6cfc-a347-465c-a0d6-f65d66f70310-kube-api-access-6c9mn\") pod \"kube-proxy-lsqnw\" (UID: \"f6eb6cfc-a347-465c-a0d6-f65d66f70310\") " pod="kube-system/kube-proxy-lsqnw" Jan 28 00:06:51.683623 containerd[1660]: time="2026-01-28T00:06:51.683440207Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lsqnw,Uid:f6eb6cfc-a347-465c-a0d6-f65d66f70310,Namespace:kube-system,Attempt:0,}" Jan 28 00:06:51.706666 containerd[1660]: time="2026-01-28T00:06:51.706626118Z" level=info msg="connecting to shim 
cbb6d17195071d052884ae9560ee494b5bea65f59e96b46eed6939f3dc2d717f" address="unix:///run/containerd/s/10b7e15457d7df17d38ed2fd7213cbe32d705ff445b7adb4ed0a95dbbe418cf9" namespace=k8s.io protocol=ttrpc version=3 Jan 28 00:06:51.731668 systemd[1]: Started cri-containerd-cbb6d17195071d052884ae9560ee494b5bea65f59e96b46eed6939f3dc2d717f.scope - libcontainer container cbb6d17195071d052884ae9560ee494b5bea65f59e96b46eed6939f3dc2d717f. Jan 28 00:06:51.738000 audit: BPF prog-id=133 op=LOAD Jan 28 00:06:51.741027 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 28 00:06:51.741088 kernel: audit: type=1334 audit(1769558811.738:439): prog-id=133 op=LOAD Jan 28 00:06:51.740000 audit: BPF prog-id=134 op=LOAD Jan 28 00:06:51.743016 kernel: audit: type=1334 audit(1769558811.740:440): prog-id=134 op=LOAD Jan 28 00:06:51.743069 kernel: audit: type=1300 audit(1769558811.740:440): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=2975 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.740000 audit[2986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=2975 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.740000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362623664313731393530373164303532383834616539353630656534 Jan 28 00:06:51.750837 kernel: audit: type=1327 audit(1769558811.740:440): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362623664313731393530373164303532383834616539353630656534 Jan 28 00:06:51.740000 audit: BPF prog-id=134 op=UNLOAD Jan 28 00:06:51.753109 kernel: audit: type=1334 audit(1769558811.740:441): prog-id=134 op=UNLOAD Jan 28 00:06:51.753166 kernel: audit: type=1300 audit(1769558811.740:441): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2975 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.740000 audit[2986]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2975 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.757438 kernel: audit: type=1327 audit(1769558811.740:441): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362623664313731393530373164303532383834616539353630656534 Jan 28 00:06:51.740000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362623664313731393530373164303532383834616539353630656534 Jan 28 00:06:51.740000 audit: BPF prog-id=135 op=LOAD Jan 28 00:06:51.763003 kernel: audit: type=1334 audit(1769558811.740:442): prog-id=135 op=LOAD Jan 28 00:06:51.740000 audit[2986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=2975 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.767552 kernel: audit: type=1300 audit(1769558811.740:442): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=2975 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.768061 kernel: audit: type=1327 audit(1769558811.740:442): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362623664313731393530373164303532383834616539353630656534 Jan 28 00:06:51.740000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362623664313731393530373164303532383834616539353630656534 Jan 28 00:06:51.741000 audit: BPF prog-id=136 op=LOAD Jan 28 00:06:51.741000 audit[2986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=2975 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.741000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362623664313731393530373164303532383834616539353630656534 Jan 28 00:06:51.745000 audit: BPF prog-id=136 op=UNLOAD Jan 28 00:06:51.745000 audit[2986]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2975 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.745000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362623664313731393530373164303532383834616539353630656534 Jan 28 00:06:51.745000 audit: BPF prog-id=135 op=UNLOAD Jan 28 00:06:51.745000 audit[2986]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2975 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.745000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362623664313731393530373164303532383834616539353630656534 Jan 28 00:06:51.745000 audit: BPF prog-id=137 op=LOAD Jan 28 00:06:51.745000 audit[2986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=2975 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.745000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6362623664313731393530373164303532383834616539353630656534 Jan 28 00:06:51.794146 containerd[1660]: time="2026-01-28T00:06:51.794106503Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lsqnw,Uid:f6eb6cfc-a347-465c-a0d6-f65d66f70310,Namespace:kube-system,Attempt:0,} returns sandbox id \"cbb6d17195071d052884ae9560ee494b5bea65f59e96b46eed6939f3dc2d717f\"" Jan 28 00:06:51.794712 systemd[1]: Created slice kubepods-besteffort-podf70770dd_5d44_418d_9614_1efca57002ab.slice - libcontainer container kubepods-besteffort-podf70770dd_5d44_418d_9614_1efca57002ab.slice. Jan 28 00:06:51.803372 containerd[1660]: time="2026-01-28T00:06:51.803332251Z" level=info msg="CreateContainer within sandbox \"cbb6d17195071d052884ae9560ee494b5bea65f59e96b46eed6939f3dc2d717f\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 28 00:06:51.815320 containerd[1660]: time="2026-01-28T00:06:51.815145247Z" level=info msg="Container 4258cde8920497b2ff661fbefbe18d322362c3f18af9189f6e9a9a34e9f24471: CDI devices from CRI Config.CDIDevices: []" Jan 28 00:06:51.825478 containerd[1660]: time="2026-01-28T00:06:51.825434318Z" level=info msg="CreateContainer within sandbox \"cbb6d17195071d052884ae9560ee494b5bea65f59e96b46eed6939f3dc2d717f\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"4258cde8920497b2ff661fbefbe18d322362c3f18af9189f6e9a9a34e9f24471\"" Jan 28 00:06:51.826310 containerd[1660]: time="2026-01-28T00:06:51.826285601Z" level=info msg="StartContainer for \"4258cde8920497b2ff661fbefbe18d322362c3f18af9189f6e9a9a34e9f24471\"" Jan 28 00:06:51.828140 containerd[1660]: time="2026-01-28T00:06:51.828077686Z" level=info msg="connecting to shim 4258cde8920497b2ff661fbefbe18d322362c3f18af9189f6e9a9a34e9f24471" address="unix:///run/containerd/s/10b7e15457d7df17d38ed2fd7213cbe32d705ff445b7adb4ed0a95dbbe418cf9" protocol=ttrpc version=3 Jan 28 00:06:51.854624 kubelet[2911]: I0128 00:06:51.854552 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zlfk\" (UniqueName: \"kubernetes.io/projected/f70770dd-5d44-418d-9614-1efca57002ab-kube-api-access-6zlfk\") pod \"tigera-operator-7dcd859c48-gqx82\" (UID: \"f70770dd-5d44-418d-9614-1efca57002ab\") " pod="tigera-operator/tigera-operator-7dcd859c48-gqx82" Jan 28 00:06:51.855149 kubelet[2911]: I0128 00:06:51.854661 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f70770dd-5d44-418d-9614-1efca57002ab-var-lib-calico\") pod \"tigera-operator-7dcd859c48-gqx82\" (UID: 
\"f70770dd-5d44-418d-9614-1efca57002ab\") " pod="tigera-operator/tigera-operator-7dcd859c48-gqx82" Jan 28 00:06:51.856578 systemd[1]: Started cri-containerd-4258cde8920497b2ff661fbefbe18d322362c3f18af9189f6e9a9a34e9f24471.scope - libcontainer container 4258cde8920497b2ff661fbefbe18d322362c3f18af9189f6e9a9a34e9f24471. Jan 28 00:06:51.914000 audit: BPF prog-id=138 op=LOAD Jan 28 00:06:51.914000 audit[3011]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2975 pid=3011 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.914000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3432353863646538393230343937623266663636316662656662653138 Jan 28 00:06:51.914000 audit: BPF prog-id=139 op=LOAD Jan 28 00:06:51.914000 audit[3011]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2975 pid=3011 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.914000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3432353863646538393230343937623266663636316662656662653138 Jan 28 00:06:51.914000 audit: BPF prog-id=139 op=UNLOAD Jan 28 00:06:51.914000 audit[3011]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2975 pid=3011 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.914000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3432353863646538393230343937623266663636316662656662653138 Jan 28 00:06:51.914000 audit: BPF prog-id=138 op=UNLOAD Jan 28 00:06:51.914000 audit[3011]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2975 pid=3011 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.914000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3432353863646538393230343937623266663636316662656662653138 Jan 28 00:06:51.914000 audit: BPF prog-id=140 op=LOAD Jan 28 00:06:51.914000 audit[3011]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2975 pid=3011 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:51.914000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3432353863646538393230343937623266663636316662656662653138 Jan 28 00:06:51.933178 containerd[1660]: time="2026-01-28T00:06:51.933104605Z" level=info msg="StartContainer for \"4258cde8920497b2ff661fbefbe18d322362c3f18af9189f6e9a9a34e9f24471\" returns successfully" Jan 28 00:06:51.970193 kubelet[2911]: I0128 00:06:51.969451 2911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-lsqnw" podStartSLOduration=0.969433876 podStartE2EDuration="969.433876ms" podCreationTimestamp="2026-01-28 00:06:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 00:06:51.968589873 +0000 UTC m=+7.127766130" watchObservedRunningTime="2026-01-28 00:06:51.969433876 +0000 UTC m=+7.128610133" Jan 28 00:06:52.098498 containerd[1660]: time="2026-01-28T00:06:52.098320827Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-gqx82,Uid:f70770dd-5d44-418d-9614-1efca57002ab,Namespace:tigera-operator,Attempt:0,}" Jan 28 00:06:52.097000 audit[3078]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3078 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.097000 audit[3078]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc4bfef50 a2=0 a3=1 items=0 ppid=3024 pid=3078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.097000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 28 00:06:52.097000 audit[3079]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3079 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.097000 audit[3079]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe0069ea0 a2=0 a3=1 items=0 ppid=3024 pid=3079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.097000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 28 00:06:52.098000 audit[3080]: NETFILTER_CFG table=nat:56 family=2 entries=1 op=nft_register_chain pid=3080 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.098000 audit[3080]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdc2e7d90 a2=0 a3=1 items=0 ppid=3024 pid=3080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.098000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 28 00:06:52.100000 audit[3084]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_chain pid=3084 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.100000 audit[3084]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 
a1=ffffdd05cdf0 a2=0 a3=1 items=0 ppid=3024 pid=3084 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.100000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 28 00:06:52.101000 audit[3082]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain pid=3082 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.101000 audit[3082]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe224a6d0 a2=0 a3=1 items=0 ppid=3024 pid=3082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.101000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 28 00:06:52.106000 audit[3086]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3086 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.106000 audit[3086]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffea57aad0 a2=0 a3=1 items=0 ppid=3024 pid=3086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.106000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 28 00:06:52.125831 containerd[1660]: time="2026-01-28T00:06:52.125770871Z" level=info msg="connecting to shim c73dc99a210b5ddbf96cf51a2198d161da4552aace3c72437d72d757a1b5dc16" address="unix:///run/containerd/s/93d3164be675932369a99174d25a07c7b91e2c779648f0f553888eacb21e5c2a" namespace=k8s.io protocol=ttrpc version=3 Jan 28 00:06:52.149440 systemd[1]: Started cri-containerd-c73dc99a210b5ddbf96cf51a2198d161da4552aace3c72437d72d757a1b5dc16.scope - libcontainer container c73dc99a210b5ddbf96cf51a2198d161da4552aace3c72437d72d757a1b5dc16. 
Jan 28 00:06:52.159000 audit: BPF prog-id=141 op=LOAD Jan 28 00:06:52.160000 audit: BPF prog-id=142 op=LOAD Jan 28 00:06:52.160000 audit[3107]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3094 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.160000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337336463393961323130623564646266393663663531613231393864 Jan 28 00:06:52.160000 audit: BPF prog-id=142 op=UNLOAD Jan 28 00:06:52.160000 audit[3107]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3094 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.160000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337336463393961323130623564646266393663663531613231393864 Jan 28 00:06:52.160000 audit: BPF prog-id=143 op=LOAD Jan 28 00:06:52.160000 audit[3107]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3094 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.160000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337336463393961323130623564646266393663663531613231393864 Jan 28 00:06:52.160000 audit: BPF prog-id=144 op=LOAD Jan 28 00:06:52.160000 audit[3107]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3094 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.160000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337336463393961323130623564646266393663663531613231393864 Jan 28 00:06:52.160000 audit: BPF prog-id=144 op=UNLOAD Jan 28 00:06:52.160000 audit[3107]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3094 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.160000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337336463393961323130623564646266393663663531613231393864 Jan 28 00:06:52.160000 audit: BPF prog-id=143 op=UNLOAD Jan 28 00:06:52.160000 audit[3107]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3094 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.160000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337336463393961323130623564646266393663663531613231393864 Jan 28 00:06:52.160000 audit: BPF prog-id=145 op=LOAD Jan 28 00:06:52.160000 audit[3107]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3094 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.160000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337336463393961323130623564646266393663663531613231393864 Jan 28 00:06:52.183988 containerd[1660]: time="2026-01-28T00:06:52.183937847Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-gqx82,Uid:f70770dd-5d44-418d-9614-1efca57002ab,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c73dc99a210b5ddbf96cf51a2198d161da4552aace3c72437d72d757a1b5dc16\"" Jan 28 00:06:52.186386 containerd[1660]: time="2026-01-28T00:06:52.186347935Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 28 00:06:52.203000 audit[3132]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3132 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.203000 audit[3132]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=fffff51e05d0 a2=0 a3=1 items=0 ppid=3024 pid=3132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.203000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 28 00:06:52.205000 audit[3134]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3134 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.205000 audit[3134]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffcf017390 a2=0 a3=1 items=0 ppid=3024 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.205000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 28 00:06:52.209000 audit[3137]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3137 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.209000 audit[3137]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=fffff4df12a0 a2=0 a3=1 items=0 ppid=3024 
pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.209000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 28 00:06:52.210000 audit[3138]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3138 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.210000 audit[3138]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe2c9fb40 a2=0 a3=1 items=0 ppid=3024 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.210000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 28 00:06:52.212000 audit[3140]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3140 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.212000 audit[3140]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff3f57190 a2=0 a3=1 items=0 ppid=3024 pid=3140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.212000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 28 00:06:52.213000 audit[3141]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3141 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.213000 audit[3141]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcbefcb30 a2=0 a3=1 items=0 ppid=3024 pid=3141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.213000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 28 00:06:52.215000 audit[3143]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3143 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.215000 audit[3143]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffff75807b0 a2=0 a3=1 items=0 ppid=3024 pid=3143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.215000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 28 00:06:52.218000 audit[3146]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule 
pid=3146 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.218000 audit[3146]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffcef0dc00 a2=0 a3=1 items=0 ppid=3024 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.218000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 28 00:06:52.219000 audit[3147]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3147 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.219000 audit[3147]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd3402380 a2=0 a3=1 items=0 ppid=3024 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.219000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 28 00:06:52.222000 audit[3149]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.222000 audit[3149]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd26f13c0 a2=0 a3=1 items=0 ppid=3024 pid=3149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.222000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 28 00:06:52.222000 audit[3150]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3150 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.222000 audit[3150]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc85e22e0 a2=0 a3=1 items=0 ppid=3024 pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.222000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 28 00:06:52.225000 audit[3152]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3152 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.225000 audit[3152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd9fc8570 a2=0 a3=1 items=0 ppid=3024 pid=3152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.225000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 28 00:06:52.230000 audit[3155]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3155 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.230000 audit[3155]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffc5675640 a2=0 a3=1 items=0 ppid=3024 pid=3155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.230000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 28 00:06:52.233000 audit[3158]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3158 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.233000 audit[3158]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffca1ef9a0 a2=0 a3=1 items=0 ppid=3024 pid=3158 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.233000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 28 00:06:52.234000 audit[3159]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3159 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.234000 audit[3159]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd8357f00 a2=0 a3=1 items=0 ppid=3024 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.234000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 28 00:06:52.237000 audit[3161]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3161 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.237000 audit[3161]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffe744eae0 a2=0 a3=1 items=0 ppid=3024 pid=3161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.237000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 28 00:06:52.240000 audit[3164]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3164 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.240000 audit[3164]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=528 a0=3 a1=ffffc53dcce0 a2=0 a3=1 items=0 ppid=3024 pid=3164 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.240000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 28 00:06:52.240000 audit[3165]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3165 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.240000 audit[3165]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe7048bf0 a2=0 a3=1 items=0 ppid=3024 pid=3165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.240000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 28 00:06:52.243000 audit[3167]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3167 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 28 00:06:52.243000 audit[3167]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffd818d520 a2=0 a3=1 items=0 ppid=3024 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.243000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 28 00:06:52.262000 audit[3173]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3173 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:06:52.262000 audit[3173]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff386db90 a2=0 a3=1 items=0 ppid=3024 pid=3173 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.262000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:06:52.281000 audit[3173]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3173 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:06:52.281000 audit[3173]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=fffff386db90 a2=0 a3=1 items=0 ppid=3024 pid=3173 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.281000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:06:52.283000 audit[3178]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3178 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.283000 audit[3178]: 
SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffef4a61c0 a2=0 a3=1 items=0 ppid=3024 pid=3178 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.283000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 28 00:06:52.285000 audit[3180]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3180 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.285000 audit[3180]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=ffffdeab6d30 a2=0 a3=1 items=0 ppid=3024 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.285000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 28 00:06:52.289000 audit[3183]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3183 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.289000 audit[3183]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffc0bf3040 a2=0 a3=1 items=0 ppid=3024 pid=3183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.289000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 28 00:06:52.290000 audit[3184]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3184 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.290000 audit[3184]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffce49ca30 a2=0 a3=1 items=0 ppid=3024 pid=3184 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.290000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 28 00:06:52.294000 audit[3186]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3186 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.294000 audit[3186]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffc190d6f0 a2=0 a3=1 items=0 ppid=3024 pid=3186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.294000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 28 00:06:52.295000 audit[3187]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3187 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.295000 audit[3187]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffd420880 a2=0 a3=1 items=0 ppid=3024 pid=3187 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.295000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 28 00:06:52.298000 audit[3189]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3189 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.298000 audit[3189]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffffa0a4b40 a2=0 a3=1 items=0 ppid=3024 pid=3189 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.298000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 28 00:06:52.301000 audit[3192]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3192 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.301000 audit[3192]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffe5098400 a2=0 a3=1 items=0 ppid=3024 pid=3192 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.301000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 28 00:06:52.303000 audit[3193]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3193 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.303000 audit[3193]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff447a1a0 a2=0 a3=1 items=0 ppid=3024 pid=3193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.303000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 28 00:06:52.306000 audit[3195]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3195 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.306000 audit[3195]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffe02b8600 a2=0 a3=1 items=0 ppid=3024 pid=3195 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.306000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 28 00:06:52.308000 audit[3196]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3196 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.308000 audit[3196]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffcf0dfa70 a2=0 a3=1 items=0 ppid=3024 pid=3196 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.308000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 28 00:06:52.310000 audit[3198]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3198 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.310000 audit[3198]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffce0f1ec0 a2=0 a3=1 items=0 ppid=3024 pid=3198 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.310000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 28 00:06:52.314000 audit[3201]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3201 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.314000 audit[3201]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff3721b20 a2=0 a3=1 items=0 ppid=3024 pid=3201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.314000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 28 00:06:52.317000 audit[3204]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3204 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.317000 audit[3204]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffc69518a0 a2=0 a3=1 items=0 ppid=3024 pid=3204 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.317000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 28 00:06:52.318000 audit[3205]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3205 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.318000 audit[3205]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff9c709c0 a2=0 a3=1 items=0 ppid=3024 pid=3205 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.318000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 28 00:06:52.321000 audit[3207]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3207 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.321000 audit[3207]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=fffffd3cea70 a2=0 a3=1 items=0 ppid=3024 pid=3207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.321000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 28 00:06:52.324000 audit[3210]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3210 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.324000 audit[3210]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffea68fc30 a2=0 a3=1 items=0 ppid=3024 pid=3210 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.324000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 28 00:06:52.325000 audit[3211]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3211 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.325000 audit[3211]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd153ca00 a2=0 a3=1 items=0 ppid=3024 pid=3211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.325000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 28 00:06:52.327000 audit[3213]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3213 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.327000 audit[3213]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffe47e5440 a2=0 a3=1 items=0 ppid=3024 pid=3213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.327000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 28 00:06:52.328000 audit[3214]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3214 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.328000 audit[3214]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff4d60830 a2=0 a3=1 items=0 ppid=3024 pid=3214 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.328000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 28 00:06:52.331000 audit[3216]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3216 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.331000 audit[3216]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffd189f810 a2=0 a3=1 items=0 ppid=3024 pid=3216 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.331000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 28 00:06:52.334000 audit[3219]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3219 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 28 00:06:52.334000 audit[3219]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffd01dc290 a2=0 a3=1 items=0 ppid=3024 pid=3219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.334000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 28 00:06:52.337000 audit[3221]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3221 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 28 00:06:52.337000 audit[3221]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 a1=ffffc63471c0 a2=0 a3=1 items=0 ppid=3024 pid=3221 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.337000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:06:52.338000 audit[3221]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3221 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 28 00:06:52.338000 audit[3221]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffc63471c0 a2=0 a3=1 items=0 ppid=3024 pid=3221 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:52.338000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:06:53.850276 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3845304384.mount: Deactivated successfully. Jan 28 00:06:54.127039 containerd[1660]: time="2026-01-28T00:06:54.126879388Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:54.128843 containerd[1660]: time="2026-01-28T00:06:54.128800314Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=20773434" Jan 28 00:06:54.129901 containerd[1660]: time="2026-01-28T00:06:54.129876438Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:54.132543 containerd[1660]: time="2026-01-28T00:06:54.132492126Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:06:54.133222 containerd[1660]: time="2026-01-28T00:06:54.133182128Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 1.946795633s" Jan 28 00:06:54.133264 containerd[1660]: time="2026-01-28T00:06:54.133223408Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Jan 28 00:06:54.137719 containerd[1660]: time="2026-01-28T00:06:54.137684301Z" level=info msg="CreateContainer within sandbox \"c73dc99a210b5ddbf96cf51a2198d161da4552aace3c72437d72d757a1b5dc16\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 28 00:06:54.150322 containerd[1660]: time="2026-01-28T00:06:54.150284540Z" level=info msg="Container 30690ce815733bdf1b4e79d1203cc2e8d3365bceb6b69db38dc176a36b6a8e6b: CDI devices from CRI Config.CDIDevices: []" Jan 28 00:06:54.157641 containerd[1660]: time="2026-01-28T00:06:54.157536922Z" level=info msg="CreateContainer within sandbox \"c73dc99a210b5ddbf96cf51a2198d161da4552aace3c72437d72d757a1b5dc16\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"30690ce815733bdf1b4e79d1203cc2e8d3365bceb6b69db38dc176a36b6a8e6b\"" Jan 28 00:06:54.158379 containerd[1660]: time="2026-01-28T00:06:54.158343004Z" level=info msg="StartContainer for \"30690ce815733bdf1b4e79d1203cc2e8d3365bceb6b69db38dc176a36b6a8e6b\"" Jan 28 00:06:54.159283 containerd[1660]: time="2026-01-28T00:06:54.159253207Z" level=info msg="connecting to shim 30690ce815733bdf1b4e79d1203cc2e8d3365bceb6b69db38dc176a36b6a8e6b" address="unix:///run/containerd/s/93d3164be675932369a99174d25a07c7b91e2c779648f0f553888eacb21e5c2a" protocol=ttrpc version=3 Jan 28 00:06:54.179647 systemd[1]: Started cri-containerd-30690ce815733bdf1b4e79d1203cc2e8d3365bceb6b69db38dc176a36b6a8e6b.scope - libcontainer container 30690ce815733bdf1b4e79d1203cc2e8d3365bceb6b69db38dc176a36b6a8e6b. 
Jan 28 00:06:54.187000 audit: BPF prog-id=146 op=LOAD Jan 28 00:06:54.188000 audit: BPF prog-id=147 op=LOAD Jan 28 00:06:54.188000 audit[3231]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3094 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:54.188000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330363930636538313537333362646631623465373964313230336363 Jan 28 00:06:54.188000 audit: BPF prog-id=147 op=UNLOAD Jan 28 00:06:54.188000 audit[3231]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3094 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:54.188000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330363930636538313537333362646631623465373964313230336363 Jan 28 00:06:54.188000 audit: BPF prog-id=148 op=LOAD Jan 28 00:06:54.188000 audit[3231]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3094 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:54.188000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330363930636538313537333362646631623465373964313230336363 Jan 28 00:06:54.188000 audit: BPF prog-id=149 op=LOAD Jan 28 00:06:54.188000 audit[3231]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3094 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:54.188000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330363930636538313537333362646631623465373964313230336363 Jan 28 00:06:54.188000 audit: BPF prog-id=149 op=UNLOAD Jan 28 00:06:54.188000 audit[3231]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3094 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:54.188000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330363930636538313537333362646631623465373964313230336363 Jan 28 00:06:54.188000 audit: BPF prog-id=148 op=UNLOAD Jan 28 00:06:54.188000 audit[3231]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3094 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:54.188000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330363930636538313537333362646631623465373964313230336363 Jan 28 00:06:54.188000 audit: BPF prog-id=150 op=LOAD Jan 28 00:06:54.188000 audit[3231]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3094 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:06:54.188000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3330363930636538313537333362646631623465373964313230336363 Jan 28 00:06:54.204020 containerd[1660]: time="2026-01-28T00:06:54.203944583Z" level=info msg="StartContainer for \"30690ce815733bdf1b4e79d1203cc2e8d3365bceb6b69db38dc176a36b6a8e6b\" returns successfully" Jan 28 00:06:55.499615 kubelet[2911]: I0128 00:06:55.499477 2911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-gqx82" podStartSLOduration=2.550383277 podStartE2EDuration="4.499460877s" podCreationTimestamp="2026-01-28 00:06:51 +0000 UTC" firstStartedPulling="2026-01-28 00:06:52.18494301 +0000 UTC m=+7.344119267" lastFinishedPulling="2026-01-28 00:06:54.13402061 +0000 UTC m=+9.293196867" observedRunningTime="2026-01-28 00:06:54.970958192 +0000 UTC m=+10.130134449" watchObservedRunningTime="2026-01-28 00:06:55.499460877 +0000 UTC m=+10.658637094" Jan 28 00:06:59.515830 sudo[1938]: pam_unix(sudo:session): session closed for user root Jan 28 00:06:59.519593 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 28 00:06:59.519643 kernel: audit: type=1106 audit(1769558819.514:519): pid=1938 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 00:06:59.514000 audit[1938]: USER_END pid=1938 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 00:06:59.514000 audit[1938]: CRED_DISP pid=1938 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 28 00:06:59.522655 kernel: audit: type=1104 audit(1769558819.514:520): pid=1938 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 28 00:06:59.613346 sshd[1937]: Connection closed by 4.153.228.146 port 59562 Jan 28 00:06:59.614007 sshd-session[1933]: pam_unix(sshd:session): session closed for user core Jan 28 00:06:59.613000 audit[1933]: USER_END pid=1933 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:06:59.614000 audit[1933]: CRED_DISP pid=1933 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:06:59.621880 systemd[1]: sshd@6-10.0.1.105:22-4.153.228.146:59562.service: Deactivated successfully. Jan 28 00:06:59.622332 kernel: audit: type=1106 audit(1769558819.613:521): pid=1933 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:06:59.624101 systemd[1]: session-8.scope: Deactivated successfully. Jan 28 00:06:59.627621 systemd[1]: session-8.scope: Consumed 7.201s CPU time, 224.4M memory peak. Jan 28 00:06:59.621000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.1.105:22-4.153.228.146:59562 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:59.633596 kernel: audit: type=1104 audit(1769558819.614:522): pid=1933 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:06:59.633671 kernel: audit: type=1131 audit(1769558819.621:523): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.1.105:22-4.153.228.146:59562 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:06:59.634915 systemd-logind[1641]: Session 8 logged out. Waiting for processes to exit. Jan 28 00:06:59.636761 systemd-logind[1641]: Removed session 8. 
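As a consistency check on the pod_startup_latency_tracker record a few entries above: podStartE2EDuration matches watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that same span minus the image-pull window (firstStartedPulling to lastFinishedPulling). A quick arithmetic sketch with the timestamps copied from that record (not kubelet's own code):

from datetime import datetime

fmt = "%Y-%m-%d %H:%M:%S.%f"
created  = datetime.strptime("2026-01-28 00:06:51.000000", fmt)  # podCreationTimestamp
pull_beg = datetime.strptime("2026-01-28 00:06:52.184943", fmt)  # firstStartedPulling
pull_end = datetime.strptime("2026-01-28 00:06:54.134020", fmt)  # lastFinishedPulling
observed = datetime.strptime("2026-01-28 00:06:55.499460", fmt)  # watchObservedRunningTime

e2e = (observed - created).total_seconds()          # ~4.499s (podStartE2EDuration)
slo = e2e - (pull_end - pull_beg).total_seconds()   # ~2.550s (podStartSLOduration)
print(f"{e2e:.3f}s end-to-end, {slo:.3f}s excluding image pull")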
Jan 28 00:07:00.754000 audit[3325]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3325 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:00.754000 audit[3325]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffeb9875f0 a2=0 a3=1 items=0 ppid=3024 pid=3325 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:00.761655 kernel: audit: type=1325 audit(1769558820.754:524): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3325 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:00.761732 kernel: audit: type=1300 audit(1769558820.754:524): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffeb9875f0 a2=0 a3=1 items=0 ppid=3024 pid=3325 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:00.761773 kernel: audit: type=1327 audit(1769558820.754:524): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:00.754000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:00.765000 audit[3325]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3325 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:00.765000 audit[3325]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffeb9875f0 a2=0 a3=1 items=0 ppid=3024 pid=3325 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:00.772305 kernel: audit: type=1325 audit(1769558820.765:525): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3325 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:00.772366 kernel: audit: type=1300 audit(1769558820.765:525): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffeb9875f0 a2=0 a3=1 items=0 ppid=3024 pid=3325 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:00.765000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:00.778000 audit[3327]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:00.778000 audit[3327]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffef62ca80 a2=0 a3=1 items=0 ppid=3024 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:00.778000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:00.786000 audit[3327]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3327 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Jan 28 00:07:00.786000 audit[3327]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffef62ca80 a2=0 a3=1 items=0 ppid=3024 pid=3327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:00.786000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:05.036000 audit[3329]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:05.038593 kernel: kauditd_printk_skb: 7 callbacks suppressed Jan 28 00:07:05.038653 kernel: audit: type=1325 audit(1769558825.036:528): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:05.036000 audit[3329]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffdb82b0a0 a2=0 a3=1 items=0 ppid=3024 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:05.043993 kernel: audit: type=1300 audit(1769558825.036:528): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffdb82b0a0 a2=0 a3=1 items=0 ppid=3024 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:05.036000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:05.047840 kernel: audit: type=1327 audit(1769558825.036:528): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:05.043000 audit[3329]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:05.050047 kernel: audit: type=1325 audit(1769558825.043:529): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3329 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:05.043000 audit[3329]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffdb82b0a0 a2=0 a3=1 items=0 ppid=3024 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:05.054951 kernel: audit: type=1300 audit(1769558825.043:529): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffdb82b0a0 a2=0 a3=1 items=0 ppid=3024 pid=3329 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:05.043000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:05.058909 kernel: audit: type=1327 audit(1769558825.043:529): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:05.067000 audit[3331]: NETFILTER_CFG 
table=filter:111 family=2 entries=18 op=nft_register_rule pid=3331 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:05.067000 audit[3331]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff9ceee90 a2=0 a3=1 items=0 ppid=3024 pid=3331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:05.074687 kernel: audit: type=1325 audit(1769558825.067:530): table=filter:111 family=2 entries=18 op=nft_register_rule pid=3331 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:05.074764 kernel: audit: type=1300 audit(1769558825.067:530): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff9ceee90 a2=0 a3=1 items=0 ppid=3024 pid=3331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:05.067000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:05.077378 kernel: audit: type=1327 audit(1769558825.067:530): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:05.078000 audit[3331]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3331 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:05.078000 audit[3331]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff9ceee90 a2=0 a3=1 items=0 ppid=3024 pid=3331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:05.082388 kernel: audit: type=1325 audit(1769558825.078:531): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3331 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:05.078000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:06.090000 audit[3333]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3333 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:06.090000 audit[3333]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffe16a0400 a2=0 a3=1 items=0 ppid=3024 pid=3333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:06.090000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:06.098000 audit[3333]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3333 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:06.098000 audit[3333]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe16a0400 a2=0 a3=1 items=0 ppid=3024 pid=3333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:06.098000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:06.974301 systemd[1]: Created slice kubepods-besteffort-pod09ed7c93_5444_4175_9ad4_0b6ded96e17f.slice - libcontainer container kubepods-besteffort-pod09ed7c93_5444_4175_9ad4_0b6ded96e17f.slice. Jan 28 00:07:07.112000 audit[3335]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3335 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:07.112000 audit[3335]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=fffffc00c940 a2=0 a3=1 items=0 ppid=3024 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.112000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:07.139000 audit[3335]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3335 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:07.139000 audit[3335]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffffc00c940 a2=0 a3=1 items=0 ppid=3024 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.139000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:07.145407 kubelet[2911]: I0128 00:07:07.145372 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ed7c93-5444-4175-9ad4-0b6ded96e17f-tigera-ca-bundle\") pod \"calico-typha-7df599b88b-886ps\" (UID: \"09ed7c93-5444-4175-9ad4-0b6ded96e17f\") " pod="calico-system/calico-typha-7df599b88b-886ps" Jan 28 00:07:07.145407 kubelet[2911]: I0128 00:07:07.145414 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/09ed7c93-5444-4175-9ad4-0b6ded96e17f-typha-certs\") pod \"calico-typha-7df599b88b-886ps\" (UID: \"09ed7c93-5444-4175-9ad4-0b6ded96e17f\") " pod="calico-system/calico-typha-7df599b88b-886ps" Jan 28 00:07:07.145407 kubelet[2911]: I0128 00:07:07.145433 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx4rh\" (UniqueName: \"kubernetes.io/projected/09ed7c93-5444-4175-9ad4-0b6ded96e17f-kube-api-access-mx4rh\") pod \"calico-typha-7df599b88b-886ps\" (UID: \"09ed7c93-5444-4175-9ad4-0b6ded96e17f\") " pod="calico-system/calico-typha-7df599b88b-886ps" Jan 28 00:07:07.201631 systemd[1]: Created slice kubepods-besteffort-podb8620a33_5232_4037_bf39_b9c55f64a071.slice - libcontainer container kubepods-besteffort-podb8620a33_5232_4037_bf39_b9c55f64a071.slice. 
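The kubepods slices created above embed the pod UID with '-' mapped to '_', since '-' is the nesting separator in systemd slice names; mapping it back recovers the UID that the kubelet reconciler logs alongside each volume. An illustrative helper (the function name is mine, not from kubelet):

def pod_uid_from_slice(slice_name: str) -> str:
    # e.g. kubepods-besteffort-podb8620a33_5232_4037_bf39_b9c55f64a071.slice
    body = slice_name.rsplit("-pod", 1)[1].removesuffix(".slice")
    return body.replace("_", "-")

print(pod_uid_from_slice(
    "kubepods-besteffort-podb8620a33_5232_4037_bf39_b9c55f64a071.slice"
))
# -> b8620a33-5232-4037-bf39-b9c55f64a071, the calico-node-zkxqr pod UID below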
Jan 28 00:07:07.278985 containerd[1660]: time="2026-01-28T00:07:07.278812374Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7df599b88b-886ps,Uid:09ed7c93-5444-4175-9ad4-0b6ded96e17f,Namespace:calico-system,Attempt:0,}" Jan 28 00:07:07.304740 containerd[1660]: time="2026-01-28T00:07:07.304694653Z" level=info msg="connecting to shim ed423e7ecb2926f93d76226657c96c5a040eba6ac83f6fe55d30d77306aa1ebf" address="unix:///run/containerd/s/3b669c439da291e767c57736029eae72b40a1e26994de3e801520532bc2a5060" namespace=k8s.io protocol=ttrpc version=3 Jan 28 00:07:07.334426 systemd[1]: Started cri-containerd-ed423e7ecb2926f93d76226657c96c5a040eba6ac83f6fe55d30d77306aa1ebf.scope - libcontainer container ed423e7ecb2926f93d76226657c96c5a040eba6ac83f6fe55d30d77306aa1ebf. Jan 28 00:07:07.347282 kubelet[2911]: I0128 00:07:07.347235 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b8620a33-5232-4037-bf39-b9c55f64a071-cni-log-dir\") pod \"calico-node-zkxqr\" (UID: \"b8620a33-5232-4037-bf39-b9c55f64a071\") " pod="calico-system/calico-node-zkxqr" Jan 28 00:07:07.347282 kubelet[2911]: I0128 00:07:07.347277 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b8620a33-5232-4037-bf39-b9c55f64a071-policysync\") pod \"calico-node-zkxqr\" (UID: \"b8620a33-5232-4037-bf39-b9c55f64a071\") " pod="calico-system/calico-node-zkxqr" Jan 28 00:07:07.347418 kubelet[2911]: I0128 00:07:07.347296 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b8620a33-5232-4037-bf39-b9c55f64a071-var-run-calico\") pod \"calico-node-zkxqr\" (UID: \"b8620a33-5232-4037-bf39-b9c55f64a071\") " pod="calico-system/calico-node-zkxqr" Jan 28 00:07:07.347418 kubelet[2911]: I0128 00:07:07.347354 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b8620a33-5232-4037-bf39-b9c55f64a071-flexvol-driver-host\") pod \"calico-node-zkxqr\" (UID: \"b8620a33-5232-4037-bf39-b9c55f64a071\") " pod="calico-system/calico-node-zkxqr" Jan 28 00:07:07.347465 kubelet[2911]: I0128 00:07:07.347391 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b8620a33-5232-4037-bf39-b9c55f64a071-var-lib-calico\") pod \"calico-node-zkxqr\" (UID: \"b8620a33-5232-4037-bf39-b9c55f64a071\") " pod="calico-system/calico-node-zkxqr" Jan 28 00:07:07.347495 kubelet[2911]: I0128 00:07:07.347474 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b8620a33-5232-4037-bf39-b9c55f64a071-cni-net-dir\") pod \"calico-node-zkxqr\" (UID: \"b8620a33-5232-4037-bf39-b9c55f64a071\") " pod="calico-system/calico-node-zkxqr" Jan 28 00:07:07.347517 kubelet[2911]: I0128 00:07:07.347505 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b8620a33-5232-4037-bf39-b9c55f64a071-lib-modules\") pod \"calico-node-zkxqr\" (UID: \"b8620a33-5232-4037-bf39-b9c55f64a071\") " pod="calico-system/calico-node-zkxqr" Jan 28 00:07:07.347537 kubelet[2911]: I0128 00:07:07.347530 2911 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8620a33-5232-4037-bf39-b9c55f64a071-tigera-ca-bundle\") pod \"calico-node-zkxqr\" (UID: \"b8620a33-5232-4037-bf39-b9c55f64a071\") " pod="calico-system/calico-node-zkxqr" Jan 28 00:07:07.347557 kubelet[2911]: I0128 00:07:07.347548 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b8620a33-5232-4037-bf39-b9c55f64a071-cni-bin-dir\") pod \"calico-node-zkxqr\" (UID: \"b8620a33-5232-4037-bf39-b9c55f64a071\") " pod="calico-system/calico-node-zkxqr" Jan 28 00:07:07.347579 kubelet[2911]: I0128 00:07:07.347564 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2cx8\" (UniqueName: \"kubernetes.io/projected/b8620a33-5232-4037-bf39-b9c55f64a071-kube-api-access-d2cx8\") pod \"calico-node-zkxqr\" (UID: \"b8620a33-5232-4037-bf39-b9c55f64a071\") " pod="calico-system/calico-node-zkxqr" Jan 28 00:07:07.347603 kubelet[2911]: I0128 00:07:07.347584 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b8620a33-5232-4037-bf39-b9c55f64a071-node-certs\") pod \"calico-node-zkxqr\" (UID: \"b8620a33-5232-4037-bf39-b9c55f64a071\") " pod="calico-system/calico-node-zkxqr" Jan 28 00:07:07.347623 kubelet[2911]: I0128 00:07:07.347605 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b8620a33-5232-4037-bf39-b9c55f64a071-xtables-lock\") pod \"calico-node-zkxqr\" (UID: \"b8620a33-5232-4037-bf39-b9c55f64a071\") " pod="calico-system/calico-node-zkxqr" Jan 28 00:07:07.352000 audit: BPF prog-id=151 op=LOAD Jan 28 00:07:07.353000 audit: BPF prog-id=152 op=LOAD Jan 28 00:07:07.353000 audit[3357]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=3346 pid=3357 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.353000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564343233653765636232393236663933643736323236363537633936 Jan 28 00:07:07.353000 audit: BPF prog-id=152 op=UNLOAD Jan 28 00:07:07.353000 audit[3357]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3346 pid=3357 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.353000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564343233653765636232393236663933643736323236363537633936 Jan 28 00:07:07.353000 audit: BPF prog-id=153 op=LOAD Jan 28 00:07:07.353000 audit[3357]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=3346 pid=3357 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.353000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564343233653765636232393236663933643736323236363537633936 Jan 28 00:07:07.353000 audit: BPF prog-id=154 op=LOAD Jan 28 00:07:07.353000 audit[3357]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=3346 pid=3357 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.353000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564343233653765636232393236663933643736323236363537633936 Jan 28 00:07:07.353000 audit: BPF prog-id=154 op=UNLOAD Jan 28 00:07:07.353000 audit[3357]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3346 pid=3357 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.353000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564343233653765636232393236663933643736323236363537633936 Jan 28 00:07:07.353000 audit: BPF prog-id=153 op=UNLOAD Jan 28 00:07:07.353000 audit[3357]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3346 pid=3357 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.353000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564343233653765636232393236663933643736323236363537633936 Jan 28 00:07:07.353000 audit: BPF prog-id=155 op=LOAD Jan 28 00:07:07.353000 audit[3357]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=3346 pid=3357 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.353000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564343233653765636232393236663933643736323236363537633936 Jan 28 00:07:07.384878 kubelet[2911]: E0128 00:07:07.384831 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:07:07.396354 containerd[1660]: 
time="2026-01-28T00:07:07.395933090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7df599b88b-886ps,Uid:09ed7c93-5444-4175-9ad4-0b6ded96e17f,Namespace:calico-system,Attempt:0,} returns sandbox id \"ed423e7ecb2926f93d76226657c96c5a040eba6ac83f6fe55d30d77306aa1ebf\"" Jan 28 00:07:07.398826 containerd[1660]: time="2026-01-28T00:07:07.398802739Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 28 00:07:07.447904 kubelet[2911]: I0128 00:07:07.447850 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/33c364a8-976a-4a66-b401-5e5e3f5c6aca-varrun\") pod \"csi-node-driver-gql7d\" (UID: \"33c364a8-976a-4a66-b401-5e5e3f5c6aca\") " pod="calico-system/csi-node-driver-gql7d" Jan 28 00:07:07.448005 kubelet[2911]: I0128 00:07:07.447930 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33c364a8-976a-4a66-b401-5e5e3f5c6aca-kubelet-dir\") pod \"csi-node-driver-gql7d\" (UID: \"33c364a8-976a-4a66-b401-5e5e3f5c6aca\") " pod="calico-system/csi-node-driver-gql7d" Jan 28 00:07:07.448005 kubelet[2911]: I0128 00:07:07.447959 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/33c364a8-976a-4a66-b401-5e5e3f5c6aca-registration-dir\") pod \"csi-node-driver-gql7d\" (UID: \"33c364a8-976a-4a66-b401-5e5e3f5c6aca\") " pod="calico-system/csi-node-driver-gql7d" Jan 28 00:07:07.448005 kubelet[2911]: I0128 00:07:07.447984 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs5cv\" (UniqueName: \"kubernetes.io/projected/33c364a8-976a-4a66-b401-5e5e3f5c6aca-kube-api-access-qs5cv\") pod \"csi-node-driver-gql7d\" (UID: \"33c364a8-976a-4a66-b401-5e5e3f5c6aca\") " pod="calico-system/csi-node-driver-gql7d" Jan 28 00:07:07.448095 kubelet[2911]: I0128 00:07:07.448018 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/33c364a8-976a-4a66-b401-5e5e3f5c6aca-socket-dir\") pod \"csi-node-driver-gql7d\" (UID: \"33c364a8-976a-4a66-b401-5e5e3f5c6aca\") " pod="calico-system/csi-node-driver-gql7d" Jan 28 00:07:07.449222 kubelet[2911]: E0128 00:07:07.448979 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.449222 kubelet[2911]: W0128 00:07:07.449009 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.449222 kubelet[2911]: E0128 00:07:07.449031 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:07.449345 kubelet[2911]: E0128 00:07:07.449246 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.449345 kubelet[2911]: W0128 00:07:07.449256 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.449345 kubelet[2911]: E0128 00:07:07.449266 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.450369 kubelet[2911]: E0128 00:07:07.449421 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.450369 kubelet[2911]: W0128 00:07:07.449438 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.450369 kubelet[2911]: E0128 00:07:07.449447 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.450369 kubelet[2911]: E0128 00:07:07.449680 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.450369 kubelet[2911]: W0128 00:07:07.449690 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.450369 kubelet[2911]: E0128 00:07:07.449724 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.450369 kubelet[2911]: E0128 00:07:07.449915 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.450369 kubelet[2911]: W0128 00:07:07.449924 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.450369 kubelet[2911]: E0128 00:07:07.449934 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.450369 kubelet[2911]: E0128 00:07:07.450139 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.451590 kubelet[2911]: W0128 00:07:07.450156 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.451590 kubelet[2911]: E0128 00:07:07.450181 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:07.451590 kubelet[2911]: E0128 00:07:07.450522 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.451590 kubelet[2911]: W0128 00:07:07.450533 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.451590 kubelet[2911]: E0128 00:07:07.450548 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.451590 kubelet[2911]: E0128 00:07:07.450681 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.451590 kubelet[2911]: W0128 00:07:07.450689 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.451590 kubelet[2911]: E0128 00:07:07.450704 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.453170 kubelet[2911]: E0128 00:07:07.452290 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.453170 kubelet[2911]: W0128 00:07:07.452307 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.453170 kubelet[2911]: E0128 00:07:07.452321 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.453170 kubelet[2911]: E0128 00:07:07.452559 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.453170 kubelet[2911]: W0128 00:07:07.452570 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.453170 kubelet[2911]: E0128 00:07:07.452579 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.453170 kubelet[2911]: E0128 00:07:07.452770 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.453170 kubelet[2911]: W0128 00:07:07.452780 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.453170 kubelet[2911]: E0128 00:07:07.452789 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:07.453832 kubelet[2911]: E0128 00:07:07.453782 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.453832 kubelet[2911]: W0128 00:07:07.453797 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.453832 kubelet[2911]: E0128 00:07:07.453808 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.464467 kubelet[2911]: E0128 00:07:07.464445 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.464467 kubelet[2911]: W0128 00:07:07.464463 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.464582 kubelet[2911]: E0128 00:07:07.464479 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.505290 containerd[1660]: time="2026-01-28T00:07:07.505256462Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-zkxqr,Uid:b8620a33-5232-4037-bf39-b9c55f64a071,Namespace:calico-system,Attempt:0,}" Jan 28 00:07:07.530789 containerd[1660]: time="2026-01-28T00:07:07.530624539Z" level=info msg="connecting to shim fc73296dd4685b7aa1ce9424a63f9253f963b68db7528ded576520528a139ad6" address="unix:///run/containerd/s/f9638a4259712c3dc8a637119bb69f76cd795252fb61365401e9e8607819e033" namespace=k8s.io protocol=ttrpc version=3 Jan 28 00:07:07.549267 kubelet[2911]: E0128 00:07:07.549240 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.549267 kubelet[2911]: W0128 00:07:07.549261 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.549719 kubelet[2911]: E0128 00:07:07.549279 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.549719 kubelet[2911]: E0128 00:07:07.549475 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.549719 kubelet[2911]: W0128 00:07:07.549484 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.549719 kubelet[2911]: E0128 00:07:07.549492 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:07.549719 kubelet[2911]: E0128 00:07:07.549700 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.549719 kubelet[2911]: W0128 00:07:07.549715 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.549844 kubelet[2911]: E0128 00:07:07.549727 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.549910 kubelet[2911]: E0128 00:07:07.549893 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.549910 kubelet[2911]: W0128 00:07:07.549905 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.549984 kubelet[2911]: E0128 00:07:07.549914 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.550075 kubelet[2911]: E0128 00:07:07.550062 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.550075 kubelet[2911]: W0128 00:07:07.550073 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.550134 kubelet[2911]: E0128 00:07:07.550081 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.550311 kubelet[2911]: E0128 00:07:07.550296 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.550311 kubelet[2911]: W0128 00:07:07.550311 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.550374 kubelet[2911]: E0128 00:07:07.550320 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.550489 kubelet[2911]: E0128 00:07:07.550478 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.550489 kubelet[2911]: W0128 00:07:07.550488 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.550535 kubelet[2911]: E0128 00:07:07.550497 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:07.550710 kubelet[2911]: E0128 00:07:07.550696 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.550710 kubelet[2911]: W0128 00:07:07.550710 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.550778 kubelet[2911]: E0128 00:07:07.550719 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.550887 kubelet[2911]: E0128 00:07:07.550875 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.550944 kubelet[2911]: W0128 00:07:07.550887 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.550944 kubelet[2911]: E0128 00:07:07.550896 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.551091 kubelet[2911]: E0128 00:07:07.551077 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.551091 kubelet[2911]: W0128 00:07:07.551090 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.551163 kubelet[2911]: E0128 00:07:07.551098 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.551380 kubelet[2911]: E0128 00:07:07.551365 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.551428 kubelet[2911]: W0128 00:07:07.551385 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.551428 kubelet[2911]: E0128 00:07:07.551397 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.551389 systemd[1]: Started cri-containerd-fc73296dd4685b7aa1ce9424a63f9253f963b68db7528ded576520528a139ad6.scope - libcontainer container fc73296dd4685b7aa1ce9424a63f9253f963b68db7528ded576520528a139ad6. 
Jan 28 00:07:07.551693 kubelet[2911]: E0128 00:07:07.551555 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.551693 kubelet[2911]: W0128 00:07:07.551563 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.551693 kubelet[2911]: E0128 00:07:07.551571 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.551826 kubelet[2911]: E0128 00:07:07.551808 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.551826 kubelet[2911]: W0128 00:07:07.551825 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.551934 kubelet[2911]: E0128 00:07:07.551835 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.552190 kubelet[2911]: E0128 00:07:07.552170 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.552190 kubelet[2911]: W0128 00:07:07.552186 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.554018 kubelet[2911]: E0128 00:07:07.552197 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.554257 kubelet[2911]: E0128 00:07:07.554242 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.554257 kubelet[2911]: W0128 00:07:07.554257 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.554345 kubelet[2911]: E0128 00:07:07.554269 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.554611 kubelet[2911]: E0128 00:07:07.554598 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.554642 kubelet[2911]: W0128 00:07:07.554612 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.554642 kubelet[2911]: E0128 00:07:07.554622 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:07.555572 kubelet[2911]: E0128 00:07:07.555546 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.555572 kubelet[2911]: W0128 00:07:07.555566 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.555967 kubelet[2911]: E0128 00:07:07.555583 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.556754 kubelet[2911]: E0128 00:07:07.556222 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.556754 kubelet[2911]: W0128 00:07:07.556236 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.556754 kubelet[2911]: E0128 00:07:07.556249 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.556754 kubelet[2911]: E0128 00:07:07.556420 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.556754 kubelet[2911]: W0128 00:07:07.556429 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.556754 kubelet[2911]: E0128 00:07:07.556438 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.556754 kubelet[2911]: E0128 00:07:07.556553 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.556754 kubelet[2911]: W0128 00:07:07.556560 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.556754 kubelet[2911]: E0128 00:07:07.556567 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.557532 kubelet[2911]: E0128 00:07:07.557506 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.557532 kubelet[2911]: W0128 00:07:07.557524 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.557532 kubelet[2911]: E0128 00:07:07.557538 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:07.558830 kubelet[2911]: E0128 00:07:07.557804 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.558830 kubelet[2911]: W0128 00:07:07.557822 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.558830 kubelet[2911]: E0128 00:07:07.557834 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.558830 kubelet[2911]: E0128 00:07:07.558068 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.558830 kubelet[2911]: W0128 00:07:07.558077 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.558830 kubelet[2911]: E0128 00:07:07.558088 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.558830 kubelet[2911]: E0128 00:07:07.558337 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.558830 kubelet[2911]: W0128 00:07:07.558348 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.558830 kubelet[2911]: E0128 00:07:07.558357 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.558830 kubelet[2911]: E0128 00:07:07.558538 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.559048 kubelet[2911]: W0128 00:07:07.558546 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.559048 kubelet[2911]: E0128 00:07:07.558554 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:07.565000 audit: BPF prog-id=156 op=LOAD Jan 28 00:07:07.566000 audit: BPF prog-id=157 op=LOAD Jan 28 00:07:07.566000 audit[3418]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3407 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.566000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663373332393664643436383562376161316365393432346136336639 Jan 28 00:07:07.566000 audit: BPF prog-id=157 op=UNLOAD Jan 28 00:07:07.566000 audit[3418]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3407 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.566000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663373332393664643436383562376161316365393432346136336639 Jan 28 00:07:07.566000 audit: BPF prog-id=158 op=LOAD Jan 28 00:07:07.566000 audit[3418]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3407 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.566000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663373332393664643436383562376161316365393432346136336639 Jan 28 00:07:07.566000 audit: BPF prog-id=159 op=LOAD Jan 28 00:07:07.566000 audit[3418]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3407 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.566000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663373332393664643436383562376161316365393432346136336639 Jan 28 00:07:07.566000 audit: BPF prog-id=159 op=UNLOAD Jan 28 00:07:07.566000 audit[3418]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3407 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.566000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663373332393664643436383562376161316365393432346136336639 Jan 28 00:07:07.566000 audit: BPF prog-id=158 op=UNLOAD Jan 
28 00:07:07.566000 audit[3418]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3407 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.566000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663373332393664643436383562376161316365393432346136336639 Jan 28 00:07:07.566000 audit: BPF prog-id=160 op=LOAD Jan 28 00:07:07.566000 audit[3418]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3407 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:07.566000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663373332393664643436383562376161316365393432346136336639 Jan 28 00:07:07.573890 kubelet[2911]: E0128 00:07:07.573855 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:07.573890 kubelet[2911]: W0128 00:07:07.573877 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:07.574044 kubelet[2911]: E0128 00:07:07.573896 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:07.586142 containerd[1660]: time="2026-01-28T00:07:07.586077508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-zkxqr,Uid:b8620a33-5232-4037-bf39-b9c55f64a071,Namespace:calico-system,Attempt:0,} returns sandbox id \"fc73296dd4685b7aa1ce9424a63f9253f963b68db7528ded576520528a139ad6\"" Jan 28 00:07:08.863735 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount305063836.mount: Deactivated successfully. 
Jan 28 00:07:08.930862 kubelet[2911]: E0128 00:07:08.930791 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:07:09.711230 containerd[1660]: time="2026-01-28T00:07:09.711146282Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:07:09.712300 containerd[1660]: time="2026-01-28T00:07:09.712261445Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31716861" Jan 28 00:07:09.712983 containerd[1660]: time="2026-01-28T00:07:09.712962288Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:07:09.715462 containerd[1660]: time="2026-01-28T00:07:09.715163214Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:07:09.716193 containerd[1660]: time="2026-01-28T00:07:09.716162217Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.317225758s" Jan 28 00:07:09.716193 containerd[1660]: time="2026-01-28T00:07:09.716190057Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Jan 28 00:07:09.717065 containerd[1660]: time="2026-01-28T00:07:09.717037220Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 28 00:07:09.726779 containerd[1660]: time="2026-01-28T00:07:09.726742769Z" level=info msg="CreateContainer within sandbox \"ed423e7ecb2926f93d76226657c96c5a040eba6ac83f6fe55d30d77306aa1ebf\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 28 00:07:09.734474 containerd[1660]: time="2026-01-28T00:07:09.734439113Z" level=info msg="Container 8fea6ca7abac0bd866b42ac3eb81775d6ae3de7192abfc85686e96b7fb08a7be: CDI devices from CRI Config.CDIDevices: []" Jan 28 00:07:09.741956 containerd[1660]: time="2026-01-28T00:07:09.741924695Z" level=info msg="CreateContainer within sandbox \"ed423e7ecb2926f93d76226657c96c5a040eba6ac83f6fe55d30d77306aa1ebf\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"8fea6ca7abac0bd866b42ac3eb81775d6ae3de7192abfc85686e96b7fb08a7be\"" Jan 28 00:07:09.742742 containerd[1660]: time="2026-01-28T00:07:09.742410897Z" level=info msg="StartContainer for \"8fea6ca7abac0bd866b42ac3eb81775d6ae3de7192abfc85686e96b7fb08a7be\"" Jan 28 00:07:09.743780 containerd[1660]: time="2026-01-28T00:07:09.743696501Z" level=info msg="connecting to shim 8fea6ca7abac0bd866b42ac3eb81775d6ae3de7192abfc85686e96b7fb08a7be" address="unix:///run/containerd/s/3b669c439da291e767c57736029eae72b40a1e26994de3e801520532bc2a5060" protocol=ttrpc version=3 Jan 28 00:07:09.765534 systemd[1]: Started 
cri-containerd-8fea6ca7abac0bd866b42ac3eb81775d6ae3de7192abfc85686e96b7fb08a7be.scope - libcontainer container 8fea6ca7abac0bd866b42ac3eb81775d6ae3de7192abfc85686e96b7fb08a7be. Jan 28 00:07:09.775000 audit: BPF prog-id=161 op=LOAD Jan 28 00:07:09.775000 audit: BPF prog-id=162 op=LOAD Jan 28 00:07:09.775000 audit[3478]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3346 pid=3478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:09.775000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866656136636137616261633062643836366234326163336562383137 Jan 28 00:07:09.775000 audit: BPF prog-id=162 op=UNLOAD Jan 28 00:07:09.775000 audit[3478]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3346 pid=3478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:09.775000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866656136636137616261633062643836366234326163336562383137 Jan 28 00:07:09.775000 audit: BPF prog-id=163 op=LOAD Jan 28 00:07:09.775000 audit[3478]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3346 pid=3478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:09.775000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866656136636137616261633062643836366234326163336562383137 Jan 28 00:07:09.775000 audit: BPF prog-id=164 op=LOAD Jan 28 00:07:09.775000 audit[3478]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3346 pid=3478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:09.775000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866656136636137616261633062643836366234326163336562383137 Jan 28 00:07:09.775000 audit: BPF prog-id=164 op=UNLOAD Jan 28 00:07:09.775000 audit[3478]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3346 pid=3478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:09.775000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866656136636137616261633062643836366234326163336562383137 Jan 28 00:07:09.775000 audit: BPF prog-id=163 op=UNLOAD Jan 28 00:07:09.775000 audit[3478]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3346 pid=3478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:09.775000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866656136636137616261633062643836366234326163336562383137 Jan 28 00:07:09.775000 audit: BPF prog-id=165 op=LOAD Jan 28 00:07:09.775000 audit[3478]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3346 pid=3478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:09.775000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3866656136636137616261633062643836366234326163336562383137 Jan 28 00:07:09.799015 containerd[1660]: time="2026-01-28T00:07:09.798980149Z" level=info msg="StartContainer for \"8fea6ca7abac0bd866b42ac3eb81775d6ae3de7192abfc85686e96b7fb08a7be\" returns successfully" Jan 28 00:07:10.062595 kubelet[2911]: E0128 00:07:10.062506 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.062595 kubelet[2911]: W0128 00:07:10.062532 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.062595 kubelet[2911]: E0128 00:07:10.062554 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.062988 kubelet[2911]: E0128 00:07:10.062780 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.062988 kubelet[2911]: W0128 00:07:10.062791 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.062988 kubelet[2911]: E0128 00:07:10.062867 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:10.063362 kubelet[2911]: E0128 00:07:10.063341 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.063362 kubelet[2911]: W0128 00:07:10.063355 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.063362 kubelet[2911]: E0128 00:07:10.063367 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.063562 kubelet[2911]: E0128 00:07:10.063513 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.063562 kubelet[2911]: W0128 00:07:10.063520 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.063562 kubelet[2911]: E0128 00:07:10.063529 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.064106 kubelet[2911]: E0128 00:07:10.063655 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.064106 kubelet[2911]: W0128 00:07:10.063665 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.064106 kubelet[2911]: E0128 00:07:10.063673 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.064106 kubelet[2911]: E0128 00:07:10.063783 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.064106 kubelet[2911]: W0128 00:07:10.063789 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.064106 kubelet[2911]: E0128 00:07:10.063816 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.064106 kubelet[2911]: E0128 00:07:10.063931 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.064106 kubelet[2911]: W0128 00:07:10.063938 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.064106 kubelet[2911]: E0128 00:07:10.063944 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:10.064106 kubelet[2911]: E0128 00:07:10.064055 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.064420 kubelet[2911]: W0128 00:07:10.064062 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.064420 kubelet[2911]: E0128 00:07:10.064071 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.064420 kubelet[2911]: E0128 00:07:10.064191 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.064420 kubelet[2911]: W0128 00:07:10.064197 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.064420 kubelet[2911]: E0128 00:07:10.064214 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.064420 kubelet[2911]: E0128 00:07:10.064350 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.064420 kubelet[2911]: W0128 00:07:10.064357 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.064420 kubelet[2911]: E0128 00:07:10.064364 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.064572 kubelet[2911]: E0128 00:07:10.064497 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.064572 kubelet[2911]: W0128 00:07:10.064506 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.064572 kubelet[2911]: E0128 00:07:10.064514 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.064644 kubelet[2911]: E0128 00:07:10.064638 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.064704 kubelet[2911]: W0128 00:07:10.064645 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.064704 kubelet[2911]: E0128 00:07:10.064653 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:10.065050 kubelet[2911]: E0128 00:07:10.065029 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.065050 kubelet[2911]: W0128 00:07:10.065041 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.065050 kubelet[2911]: E0128 00:07:10.065049 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.065220 kubelet[2911]: E0128 00:07:10.065189 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.065220 kubelet[2911]: W0128 00:07:10.065198 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.065278 kubelet[2911]: E0128 00:07:10.065223 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.065362 kubelet[2911]: E0128 00:07:10.065350 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.065362 kubelet[2911]: W0128 00:07:10.065359 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.065411 kubelet[2911]: E0128 00:07:10.065366 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.068087 kubelet[2911]: E0128 00:07:10.068060 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.068087 kubelet[2911]: W0128 00:07:10.068078 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.068087 kubelet[2911]: E0128 00:07:10.068091 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.068282 kubelet[2911]: E0128 00:07:10.068270 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.068282 kubelet[2911]: W0128 00:07:10.068280 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.068358 kubelet[2911]: E0128 00:07:10.068290 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:10.068965 kubelet[2911]: E0128 00:07:10.068948 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.068965 kubelet[2911]: W0128 00:07:10.068964 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.069020 kubelet[2911]: E0128 00:07:10.068976 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.069954 kubelet[2911]: E0128 00:07:10.069937 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.069954 kubelet[2911]: W0128 00:07:10.069952 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.070049 kubelet[2911]: E0128 00:07:10.069967 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.070220 kubelet[2911]: E0128 00:07:10.070183 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.070220 kubelet[2911]: W0128 00:07:10.070197 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.070283 kubelet[2911]: E0128 00:07:10.070260 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.070884 kubelet[2911]: E0128 00:07:10.070865 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.071269 kubelet[2911]: W0128 00:07:10.071250 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.071313 kubelet[2911]: E0128 00:07:10.071274 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.071471 kubelet[2911]: E0128 00:07:10.071458 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.071501 kubelet[2911]: W0128 00:07:10.071471 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.071501 kubelet[2911]: E0128 00:07:10.071481 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:10.071622 kubelet[2911]: E0128 00:07:10.071611 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.071622 kubelet[2911]: W0128 00:07:10.071621 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.071681 kubelet[2911]: E0128 00:07:10.071629 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.071761 kubelet[2911]: E0128 00:07:10.071752 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.071761 kubelet[2911]: W0128 00:07:10.071761 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.071820 kubelet[2911]: E0128 00:07:10.071768 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.071893 kubelet[2911]: E0128 00:07:10.071881 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.071893 kubelet[2911]: W0128 00:07:10.071891 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.071974 kubelet[2911]: E0128 00:07:10.071898 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.072049 kubelet[2911]: E0128 00:07:10.072039 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.072049 kubelet[2911]: W0128 00:07:10.072048 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.072092 kubelet[2911]: E0128 00:07:10.072056 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.072499 kubelet[2911]: E0128 00:07:10.072427 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.072499 kubelet[2911]: W0128 00:07:10.072444 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.072499 kubelet[2911]: E0128 00:07:10.072455 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:10.072857 kubelet[2911]: E0128 00:07:10.072769 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.072857 kubelet[2911]: W0128 00:07:10.072781 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.072857 kubelet[2911]: E0128 00:07:10.072791 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.073091 kubelet[2911]: E0128 00:07:10.073078 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.073217 kubelet[2911]: W0128 00:07:10.073145 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.073217 kubelet[2911]: E0128 00:07:10.073159 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.073461 kubelet[2911]: E0128 00:07:10.073450 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.073611 kubelet[2911]: W0128 00:07:10.073509 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.073611 kubelet[2911]: E0128 00:07:10.073524 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.073769 kubelet[2911]: E0128 00:07:10.073758 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.073819 kubelet[2911]: W0128 00:07:10.073808 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.073870 kubelet[2911]: E0128 00:07:10.073860 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.074193 kubelet[2911]: E0128 00:07:10.074075 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.074193 kubelet[2911]: W0128 00:07:10.074085 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.074193 kubelet[2911]: E0128 00:07:10.074094 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:10.074393 kubelet[2911]: E0128 00:07:10.074381 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:10.074521 kubelet[2911]: W0128 00:07:10.074507 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:10.074613 kubelet[2911]: E0128 00:07:10.074596 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:10.928925 kubelet[2911]: E0128 00:07:10.928514 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:07:10.993343 kubelet[2911]: I0128 00:07:10.993297 2911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 00:07:11.072354 kubelet[2911]: E0128 00:07:11.072310 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.072354 kubelet[2911]: W0128 00:07:11.072334 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.072354 kubelet[2911]: E0128 00:07:11.072353 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.072716 kubelet[2911]: E0128 00:07:11.072505 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.072716 kubelet[2911]: W0128 00:07:11.072511 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.072716 kubelet[2911]: E0128 00:07:11.072555 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.072716 kubelet[2911]: E0128 00:07:11.072717 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.072794 kubelet[2911]: W0128 00:07:11.072723 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.072794 kubelet[2911]: E0128 00:07:11.072731 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:11.072870 kubelet[2911]: E0128 00:07:11.072851 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.072870 kubelet[2911]: W0128 00:07:11.072861 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.072870 kubelet[2911]: E0128 00:07:11.072869 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.073022 kubelet[2911]: E0128 00:07:11.072993 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.073022 kubelet[2911]: W0128 00:07:11.073004 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.073022 kubelet[2911]: E0128 00:07:11.073012 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.073128 kubelet[2911]: E0128 00:07:11.073119 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.073128 kubelet[2911]: W0128 00:07:11.073127 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.073175 kubelet[2911]: E0128 00:07:11.073135 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.073284 kubelet[2911]: E0128 00:07:11.073259 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.073284 kubelet[2911]: W0128 00:07:11.073275 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.073344 kubelet[2911]: E0128 00:07:11.073283 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.073431 kubelet[2911]: E0128 00:07:11.073405 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.073431 kubelet[2911]: W0128 00:07:11.073415 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.073431 kubelet[2911]: E0128 00:07:11.073422 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:11.073735 kubelet[2911]: E0128 00:07:11.073701 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.073735 kubelet[2911]: W0128 00:07:11.073721 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.073735 kubelet[2911]: E0128 00:07:11.073733 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.073935 kubelet[2911]: E0128 00:07:11.073873 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.073935 kubelet[2911]: W0128 00:07:11.073922 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.073935 kubelet[2911]: E0128 00:07:11.073933 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.074077 kubelet[2911]: E0128 00:07:11.074065 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.074077 kubelet[2911]: W0128 00:07:11.074075 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.074122 kubelet[2911]: E0128 00:07:11.074085 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.074225 kubelet[2911]: E0128 00:07:11.074214 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.074256 kubelet[2911]: W0128 00:07:11.074224 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.074256 kubelet[2911]: E0128 00:07:11.074233 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.075888 kubelet[2911]: E0128 00:07:11.075843 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.075888 kubelet[2911]: W0128 00:07:11.075864 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.075888 kubelet[2911]: E0128 00:07:11.075877 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:11.076085 kubelet[2911]: E0128 00:07:11.076051 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.076085 kubelet[2911]: W0128 00:07:11.076064 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.076085 kubelet[2911]: E0128 00:07:11.076075 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.076364 kubelet[2911]: E0128 00:07:11.076338 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.076364 kubelet[2911]: W0128 00:07:11.076355 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.076412 kubelet[2911]: E0128 00:07:11.076366 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.076703 kubelet[2911]: E0128 00:07:11.076686 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.076703 kubelet[2911]: W0128 00:07:11.076699 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.076770 kubelet[2911]: E0128 00:07:11.076710 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.076891 kubelet[2911]: E0128 00:07:11.076877 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.076891 kubelet[2911]: W0128 00:07:11.076888 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.076944 kubelet[2911]: E0128 00:07:11.076896 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.077115 kubelet[2911]: E0128 00:07:11.077096 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.077159 kubelet[2911]: W0128 00:07:11.077115 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.077159 kubelet[2911]: E0128 00:07:11.077129 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:11.077326 kubelet[2911]: E0128 00:07:11.077315 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.077358 kubelet[2911]: W0128 00:07:11.077326 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.077358 kubelet[2911]: E0128 00:07:11.077335 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.077494 kubelet[2911]: E0128 00:07:11.077482 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.077494 kubelet[2911]: W0128 00:07:11.077494 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.077553 kubelet[2911]: E0128 00:07:11.077502 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.077747 kubelet[2911]: E0128 00:07:11.077729 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.077747 kubelet[2911]: W0128 00:07:11.077741 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.077809 kubelet[2911]: E0128 00:07:11.077752 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.078098 kubelet[2911]: E0128 00:07:11.078050 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.078098 kubelet[2911]: W0128 00:07:11.078062 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.078318 kubelet[2911]: E0128 00:07:11.078092 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.078490 kubelet[2911]: E0128 00:07:11.078415 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.078530 kubelet[2911]: W0128 00:07:11.078427 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.078530 kubelet[2911]: E0128 00:07:11.078521 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:11.078715 kubelet[2911]: E0128 00:07:11.078703 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.078715 kubelet[2911]: W0128 00:07:11.078714 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.078774 kubelet[2911]: E0128 00:07:11.078724 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.078871 kubelet[2911]: E0128 00:07:11.078860 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.078871 kubelet[2911]: W0128 00:07:11.078869 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.078937 kubelet[2911]: E0128 00:07:11.078876 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.079124 kubelet[2911]: E0128 00:07:11.079055 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.079124 kubelet[2911]: W0128 00:07:11.079114 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.079303 kubelet[2911]: E0128 00:07:11.079128 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.079828 kubelet[2911]: E0128 00:07:11.079772 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.079828 kubelet[2911]: W0128 00:07:11.079827 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.079895 kubelet[2911]: E0128 00:07:11.079841 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.080122 kubelet[2911]: E0128 00:07:11.080084 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.080122 kubelet[2911]: W0128 00:07:11.080099 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.080122 kubelet[2911]: E0128 00:07:11.080109 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:11.080346 kubelet[2911]: E0128 00:07:11.080332 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.080383 kubelet[2911]: W0128 00:07:11.080358 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.080383 kubelet[2911]: E0128 00:07:11.080369 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.080600 kubelet[2911]: E0128 00:07:11.080584 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.080600 kubelet[2911]: W0128 00:07:11.080598 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.080680 kubelet[2911]: E0128 00:07:11.080608 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.080821 kubelet[2911]: E0128 00:07:11.080805 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.080821 kubelet[2911]: W0128 00:07:11.080819 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.080866 kubelet[2911]: E0128 00:07:11.080829 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.081156 kubelet[2911]: E0128 00:07:11.081138 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.081156 kubelet[2911]: W0128 00:07:11.081153 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.081245 kubelet[2911]: E0128 00:07:11.081164 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 28 00:07:11.082099 kubelet[2911]: E0128 00:07:11.082021 2911 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 28 00:07:11.082099 kubelet[2911]: W0128 00:07:11.082037 2911 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 28 00:07:11.082099 kubelet[2911]: E0128 00:07:11.082049 2911 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 28 00:07:11.218927 containerd[1660]: time="2026-01-28T00:07:11.218830221Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:07:11.220893 containerd[1660]: time="2026-01-28T00:07:11.220838507Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:11.221044 containerd[1660]: time="2026-01-28T00:07:11.220921268Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:07:11.222980 containerd[1660]: time="2026-01-28T00:07:11.222935034Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:07:11.223432 containerd[1660]: time="2026-01-28T00:07:11.223399955Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.506330975s" Jan 28 00:07:11.223474 containerd[1660]: time="2026-01-28T00:07:11.223433315Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Jan 28 00:07:11.228070 containerd[1660]: time="2026-01-28T00:07:11.228037409Z" level=info msg="CreateContainer within sandbox \"fc73296dd4685b7aa1ce9424a63f9253f963b68db7528ded576520528a139ad6\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 28 00:07:11.239802 containerd[1660]: time="2026-01-28T00:07:11.239352044Z" level=info msg="Container 665988b98bba27c2dea84881b371dbfb9057f4547426c7484ffcd8bf87fe5ab7: CDI devices from CRI Config.CDIDevices: []" Jan 28 00:07:11.250665 containerd[1660]: time="2026-01-28T00:07:11.250624038Z" level=info msg="CreateContainer within sandbox \"fc73296dd4685b7aa1ce9424a63f9253f963b68db7528ded576520528a139ad6\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"665988b98bba27c2dea84881b371dbfb9057f4547426c7484ffcd8bf87fe5ab7\"" Jan 28 00:07:11.251457 containerd[1660]: time="2026-01-28T00:07:11.251431240Z" level=info msg="StartContainer for \"665988b98bba27c2dea84881b371dbfb9057f4547426c7484ffcd8bf87fe5ab7\"" Jan 28 00:07:11.252795 containerd[1660]: time="2026-01-28T00:07:11.252772284Z" level=info msg="connecting to shim 665988b98bba27c2dea84881b371dbfb9057f4547426c7484ffcd8bf87fe5ab7" address="unix:///run/containerd/s/f9638a4259712c3dc8a637119bb69f76cd795252fb61365401e9e8607819e033" protocol=ttrpc version=3 Jan 28 00:07:11.273375 systemd[1]: Started cri-containerd-665988b98bba27c2dea84881b371dbfb9057f4547426c7484ffcd8bf87fe5ab7.scope - libcontainer container 665988b98bba27c2dea84881b371dbfb9057f4547426c7484ffcd8bf87fe5ab7. 
Jan 28 00:07:11.326000 audit: BPF prog-id=166 op=LOAD Jan 28 00:07:11.328257 kernel: kauditd_printk_skb: 80 callbacks suppressed Jan 28 00:07:11.328293 kernel: audit: type=1334 audit(1769558831.326:560): prog-id=166 op=LOAD Jan 28 00:07:11.326000 audit[3591]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3407 pid=3591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:11.332390 kernel: audit: type=1300 audit(1769558831.326:560): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3407 pid=3591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:11.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636353938386239386262613237633264656138343838316233373164 Jan 28 00:07:11.336519 kernel: audit: type=1327 audit(1769558831.326:560): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636353938386239386262613237633264656138343838316233373164 Jan 28 00:07:11.336689 kernel: audit: type=1334 audit(1769558831.327:561): prog-id=167 op=LOAD Jan 28 00:07:11.327000 audit: BPF prog-id=167 op=LOAD Jan 28 00:07:11.327000 audit[3591]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3407 pid=3591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:11.340588 kernel: audit: type=1300 audit(1769558831.327:561): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3407 pid=3591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:11.327000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636353938386239386262613237633264656138343838316233373164 Jan 28 00:07:11.344395 kernel: audit: type=1327 audit(1769558831.327:561): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636353938386239386262613237633264656138343838316233373164 Jan 28 00:07:11.344534 kernel: audit: type=1334 audit(1769558831.327:562): prog-id=167 op=UNLOAD Jan 28 00:07:11.327000 audit: BPF prog-id=167 op=UNLOAD Jan 28 00:07:11.327000 audit[3591]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3407 pid=3591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:11.348487 kernel: audit: type=1300 
audit(1769558831.327:562): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3407 pid=3591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:11.327000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636353938386239386262613237633264656138343838316233373164 Jan 28 00:07:11.351889 kernel: audit: type=1327 audit(1769558831.327:562): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636353938386239386262613237633264656138343838316233373164 Jan 28 00:07:11.351939 kernel: audit: type=1334 audit(1769558831.327:563): prog-id=166 op=UNLOAD Jan 28 00:07:11.327000 audit: BPF prog-id=166 op=UNLOAD Jan 28 00:07:11.327000 audit[3591]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3407 pid=3591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:11.327000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636353938386239386262613237633264656138343838316233373164 Jan 28 00:07:11.327000 audit: BPF prog-id=168 op=LOAD Jan 28 00:07:11.327000 audit[3591]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3407 pid=3591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:11.327000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636353938386239386262613237633264656138343838316233373164 Jan 28 00:07:11.367042 containerd[1660]: time="2026-01-28T00:07:11.367011111Z" level=info msg="StartContainer for \"665988b98bba27c2dea84881b371dbfb9057f4547426c7484ffcd8bf87fe5ab7\" returns successfully" Jan 28 00:07:11.379854 systemd[1]: cri-containerd-665988b98bba27c2dea84881b371dbfb9057f4547426c7484ffcd8bf87fe5ab7.scope: Deactivated successfully. Jan 28 00:07:11.383504 containerd[1660]: time="2026-01-28T00:07:11.383441761Z" level=info msg="received container exit event container_id:\"665988b98bba27c2dea84881b371dbfb9057f4547426c7484ffcd8bf87fe5ab7\" id:\"665988b98bba27c2dea84881b371dbfb9057f4547426c7484ffcd8bf87fe5ab7\" pid:3605 exited_at:{seconds:1769558831 nanos:383114280}" Jan 28 00:07:11.383000 audit: BPF prog-id=168 op=UNLOAD Jan 28 00:07:11.402698 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-665988b98bba27c2dea84881b371dbfb9057f4547426c7484ffcd8bf87fe5ab7-rootfs.mount: Deactivated successfully. 
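[editor's note] The audit records around the container start can be decoded from their raw fields: arch=c00000b7 is AUDIT_ARCH_AARCH64 (EM_AARCH64 with the 64-bit and little-endian flags), and on arm64 syscall 280 is bpf(2) while syscall 57 is close(2), so each BPF prog LOAD/UNLOAD pair corresponds to runc loading a filter program (most likely the cgroup device filter) and then closing its fd. The snippet below just works through that decoding; the constant names are taken from the kernel's audit/ELF headers and the helper itself is illustrative.

```go
// Decode the audit "arch" field seen above (arch=c00000b7) and note the
// arm64 syscall numbers involved. Constant values follow linux/audit.h
// and the ELF machine table; the program itself is only a worked example.
package main

import "fmt"

const (
	auditArch64Bit = 0x80000000 // __AUDIT_ARCH_64BIT
	auditArchLE    = 0x40000000 // __AUDIT_ARCH_LE
	emAarch64      = 183        // EM_AARCH64 (0xb7)
)

func main() {
	arch := uint32(0xc00000b7)
	fmt.Printf("64-bit=%v little-endian=%v machine=%d (EM_AARCH64=%d)\n",
		arch&auditArch64Bit != 0, arch&auditArchLE != 0, arch&0xffff, emAarch64)
	// On arm64: syscall 280 = bpf(2) (the BPF prog-id LOAD records),
	// syscall 57 = close(2) (the matching UNLOAD records when runc
	// closes the program fd after attaching it).
}
```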
Jan 28 00:07:11.998390 containerd[1660]: time="2026-01-28T00:07:11.998345869Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 28 00:07:12.015784 kubelet[2911]: I0128 00:07:12.015719 2911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7df599b88b-886ps" podStartSLOduration=3.69728256 podStartE2EDuration="6.015702882s" podCreationTimestamp="2026-01-28 00:07:06 +0000 UTC" firstStartedPulling="2026-01-28 00:07:07.398506578 +0000 UTC m=+22.557682835" lastFinishedPulling="2026-01-28 00:07:09.71692686 +0000 UTC m=+24.876103157" observedRunningTime="2026-01-28 00:07:10.007135781 +0000 UTC m=+25.166312038" watchObservedRunningTime="2026-01-28 00:07:12.015702882 +0000 UTC m=+27.174879139" Jan 28 00:07:12.929116 kubelet[2911]: E0128 00:07:12.929003 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:07:14.930455 kubelet[2911]: E0128 00:07:14.930421 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:07:15.375609 containerd[1660]: time="2026-01-28T00:07:15.375535847Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:07:15.377332 containerd[1660]: time="2026-01-28T00:07:15.377124691Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921280" Jan 28 00:07:15.387244 containerd[1660]: time="2026-01-28T00:07:15.387201842Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:07:15.389830 containerd[1660]: time="2026-01-28T00:07:15.389796530Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:07:15.390880 containerd[1660]: time="2026-01-28T00:07:15.390776773Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 3.392394504s" Jan 28 00:07:15.390880 containerd[1660]: time="2026-01-28T00:07:15.390803213Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Jan 28 00:07:15.395888 containerd[1660]: time="2026-01-28T00:07:15.395849868Z" level=info msg="CreateContainer within sandbox \"fc73296dd4685b7aa1ce9424a63f9253f963b68db7528ded576520528a139ad6\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 28 00:07:15.406651 containerd[1660]: time="2026-01-28T00:07:15.406585341Z" level=info msg="Container 
f43ff44ed3edff4cebf4e098303abe22aa69f1b1509c3ce2173856ab5da003dc: CDI devices from CRI Config.CDIDevices: []" Jan 28 00:07:15.408247 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4111014464.mount: Deactivated successfully. Jan 28 00:07:15.418891 containerd[1660]: time="2026-01-28T00:07:15.418849058Z" level=info msg="CreateContainer within sandbox \"fc73296dd4685b7aa1ce9424a63f9253f963b68db7528ded576520528a139ad6\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"f43ff44ed3edff4cebf4e098303abe22aa69f1b1509c3ce2173856ab5da003dc\"" Jan 28 00:07:15.419407 containerd[1660]: time="2026-01-28T00:07:15.419367940Z" level=info msg="StartContainer for \"f43ff44ed3edff4cebf4e098303abe22aa69f1b1509c3ce2173856ab5da003dc\"" Jan 28 00:07:15.423641 containerd[1660]: time="2026-01-28T00:07:15.423584713Z" level=info msg="connecting to shim f43ff44ed3edff4cebf4e098303abe22aa69f1b1509c3ce2173856ab5da003dc" address="unix:///run/containerd/s/f9638a4259712c3dc8a637119bb69f76cd795252fb61365401e9e8607819e033" protocol=ttrpc version=3 Jan 28 00:07:15.446366 systemd[1]: Started cri-containerd-f43ff44ed3edff4cebf4e098303abe22aa69f1b1509c3ce2173856ab5da003dc.scope - libcontainer container f43ff44ed3edff4cebf4e098303abe22aa69f1b1509c3ce2173856ab5da003dc. Jan 28 00:07:15.494000 audit: BPF prog-id=169 op=LOAD Jan 28 00:07:15.494000 audit[3654]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3407 pid=3654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:15.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634336666343465643365646666346365626634653039383330336162 Jan 28 00:07:15.494000 audit: BPF prog-id=170 op=LOAD Jan 28 00:07:15.494000 audit[3654]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3407 pid=3654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:15.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634336666343465643365646666346365626634653039383330336162 Jan 28 00:07:15.494000 audit: BPF prog-id=170 op=UNLOAD Jan 28 00:07:15.494000 audit[3654]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3407 pid=3654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:15.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634336666343465643365646666346365626634653039383330336162 Jan 28 00:07:15.494000 audit: BPF prog-id=169 op=UNLOAD Jan 28 00:07:15.494000 audit[3654]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3407 pid=3654 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:15.494000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634336666343465643365646666346365626634653039383330336162 Jan 28 00:07:15.495000 audit: BPF prog-id=171 op=LOAD Jan 28 00:07:15.495000 audit[3654]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3407 pid=3654 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:15.495000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634336666343465643365646666346365626634653039383330336162 Jan 28 00:07:15.516326 containerd[1660]: time="2026-01-28T00:07:15.516279474Z" level=info msg="StartContainer for \"f43ff44ed3edff4cebf4e098303abe22aa69f1b1509c3ce2173856ab5da003dc\" returns successfully" Jan 28 00:07:15.959175 containerd[1660]: time="2026-01-28T00:07:15.959112939Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 28 00:07:15.961112 systemd[1]: cri-containerd-f43ff44ed3edff4cebf4e098303abe22aa69f1b1509c3ce2173856ab5da003dc.scope: Deactivated successfully. Jan 28 00:07:15.961609 systemd[1]: cri-containerd-f43ff44ed3edff4cebf4e098303abe22aa69f1b1509c3ce2173856ab5da003dc.scope: Consumed 459ms CPU time, 189M memory peak, 165.9M written to disk. Jan 28 00:07:15.963669 containerd[1660]: time="2026-01-28T00:07:15.963636273Z" level=info msg="received container exit event container_id:\"f43ff44ed3edff4cebf4e098303abe22aa69f1b1509c3ce2173856ab5da003dc\" id:\"f43ff44ed3edff4cebf4e098303abe22aa69f1b1509c3ce2173856ab5da003dc\" pid:3667 exited_at:{seconds:1769558835 nanos:963403392}" Jan 28 00:07:15.966000 audit: BPF prog-id=171 op=UNLOAD Jan 28 00:07:15.981918 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f43ff44ed3edff4cebf4e098303abe22aa69f1b1509c3ce2173856ab5da003dc-rootfs.mount: Deactivated successfully. Jan 28 00:07:16.054977 kubelet[2911]: I0128 00:07:16.054924 2911 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 28 00:07:16.094970 systemd[1]: Created slice kubepods-besteffort-podc30f0bb9_9acc_4a29_aba6_a8788797d9e8.slice - libcontainer container kubepods-besteffort-podc30f0bb9_9acc_4a29_aba6_a8788797d9e8.slice. Jan 28 00:07:16.102637 systemd[1]: Created slice kubepods-burstable-pode8f7a78c_8d11_4e0b_a34b_f3ae93c9f74b.slice - libcontainer container kubepods-burstable-pode8f7a78c_8d11_4e0b_a34b_f3ae93c9f74b.slice. Jan 28 00:07:16.111961 systemd[1]: Created slice kubepods-besteffort-pod7d3bf2f0_73b0_413a_909b_a8ef20ea0438.slice - libcontainer container kubepods-besteffort-pod7d3bf2f0_73b0_413a_909b_a8ef20ea0438.slice. 
Jan 28 00:07:16.112344 kubelet[2911]: I0128 00:07:16.112090 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6b65da67-3308-40ad-9789-eac2c3fb395b-whisker-backend-key-pair\") pod \"whisker-6f4b7bc95d-nndks\" (UID: \"6b65da67-3308-40ad-9789-eac2c3fb395b\") " pod="calico-system/whisker-6f4b7bc95d-nndks" Jan 28 00:07:16.112344 kubelet[2911]: I0128 00:07:16.112129 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b65da67-3308-40ad-9789-eac2c3fb395b-whisker-ca-bundle\") pod \"whisker-6f4b7bc95d-nndks\" (UID: \"6b65da67-3308-40ad-9789-eac2c3fb395b\") " pod="calico-system/whisker-6f4b7bc95d-nndks" Jan 28 00:07:16.112344 kubelet[2911]: I0128 00:07:16.112148 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlrrv\" (UniqueName: \"kubernetes.io/projected/6b65da67-3308-40ad-9789-eac2c3fb395b-kube-api-access-mlrrv\") pod \"whisker-6f4b7bc95d-nndks\" (UID: \"6b65da67-3308-40ad-9789-eac2c3fb395b\") " pod="calico-system/whisker-6f4b7bc95d-nndks" Jan 28 00:07:16.112344 kubelet[2911]: I0128 00:07:16.112165 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d3bf2f0-73b0-413a-909b-a8ef20ea0438-tigera-ca-bundle\") pod \"calico-kube-controllers-579c57b986-v9zhd\" (UID: \"7d3bf2f0-73b0-413a-909b-a8ef20ea0438\") " pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" Jan 28 00:07:16.112563 kubelet[2911]: I0128 00:07:16.112437 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvvxj\" (UniqueName: \"kubernetes.io/projected/c30f0bb9-9acc-4a29-aba6-a8788797d9e8-kube-api-access-lvvxj\") pod \"calico-apiserver-564d6bbdf8-z4ccx\" (UID: \"c30f0bb9-9acc-4a29-aba6-a8788797d9e8\") " pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" Jan 28 00:07:16.112608 kubelet[2911]: I0128 00:07:16.112584 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2wq7\" (UniqueName: \"kubernetes.io/projected/88283619-c3e2-4b73-b5a3-401f940acb8e-kube-api-access-s2wq7\") pod \"coredns-674b8bbfcf-6kv44\" (UID: \"88283619-c3e2-4b73-b5a3-401f940acb8e\") " pod="kube-system/coredns-674b8bbfcf-6kv44" Jan 28 00:07:16.112822 kubelet[2911]: I0128 00:07:16.112618 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c30f0bb9-9acc-4a29-aba6-a8788797d9e8-calico-apiserver-certs\") pod \"calico-apiserver-564d6bbdf8-z4ccx\" (UID: \"c30f0bb9-9acc-4a29-aba6-a8788797d9e8\") " pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" Jan 28 00:07:16.112822 kubelet[2911]: I0128 00:07:16.112786 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkhp5\" (UniqueName: \"kubernetes.io/projected/7d3bf2f0-73b0-413a-909b-a8ef20ea0438-kube-api-access-nkhp5\") pod \"calico-kube-controllers-579c57b986-v9zhd\" (UID: \"7d3bf2f0-73b0-413a-909b-a8ef20ea0438\") " pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" Jan 28 00:07:16.112903 kubelet[2911]: I0128 00:07:16.112836 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0bfcf205-da0c-44de-9408-ba1ddb5cd934-calico-apiserver-certs\") pod \"calico-apiserver-564d6bbdf8-pmnrt\" (UID: \"0bfcf205-da0c-44de-9408-ba1ddb5cd934\") " pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" Jan 28 00:07:16.112903 kubelet[2911]: I0128 00:07:16.112865 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj8fj\" (UniqueName: \"kubernetes.io/projected/0bfcf205-da0c-44de-9408-ba1ddb5cd934-kube-api-access-vj8fj\") pod \"calico-apiserver-564d6bbdf8-pmnrt\" (UID: \"0bfcf205-da0c-44de-9408-ba1ddb5cd934\") " pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" Jan 28 00:07:16.113101 kubelet[2911]: I0128 00:07:16.113075 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67g6t\" (UniqueName: \"kubernetes.io/projected/e8f7a78c-8d11-4e0b-a34b-f3ae93c9f74b-kube-api-access-67g6t\") pod \"coredns-674b8bbfcf-vcgk6\" (UID: \"e8f7a78c-8d11-4e0b-a34b-f3ae93c9f74b\") " pod="kube-system/coredns-674b8bbfcf-vcgk6" Jan 28 00:07:16.113143 kubelet[2911]: I0128 00:07:16.113106 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/115118f6-6ada-4444-93ef-3a99eaaacd44-config\") pod \"goldmane-666569f655-4g849\" (UID: \"115118f6-6ada-4444-93ef-3a99eaaacd44\") " pod="calico-system/goldmane-666569f655-4g849" Jan 28 00:07:16.113628 kubelet[2911]: I0128 00:07:16.113235 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/115118f6-6ada-4444-93ef-3a99eaaacd44-goldmane-ca-bundle\") pod \"goldmane-666569f655-4g849\" (UID: \"115118f6-6ada-4444-93ef-3a99eaaacd44\") " pod="calico-system/goldmane-666569f655-4g849" Jan 28 00:07:16.113628 kubelet[2911]: I0128 00:07:16.113266 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/115118f6-6ada-4444-93ef-3a99eaaacd44-goldmane-key-pair\") pod \"goldmane-666569f655-4g849\" (UID: \"115118f6-6ada-4444-93ef-3a99eaaacd44\") " pod="calico-system/goldmane-666569f655-4g849" Jan 28 00:07:16.113628 kubelet[2911]: I0128 00:07:16.113330 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8f7a78c-8d11-4e0b-a34b-f3ae93c9f74b-config-volume\") pod \"coredns-674b8bbfcf-vcgk6\" (UID: \"e8f7a78c-8d11-4e0b-a34b-f3ae93c9f74b\") " pod="kube-system/coredns-674b8bbfcf-vcgk6" Jan 28 00:07:16.113628 kubelet[2911]: I0128 00:07:16.113350 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh76x\" (UniqueName: \"kubernetes.io/projected/115118f6-6ada-4444-93ef-3a99eaaacd44-kube-api-access-dh76x\") pod \"goldmane-666569f655-4g849\" (UID: \"115118f6-6ada-4444-93ef-3a99eaaacd44\") " pod="calico-system/goldmane-666569f655-4g849" Jan 28 00:07:16.113628 kubelet[2911]: I0128 00:07:16.113368 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88283619-c3e2-4b73-b5a3-401f940acb8e-config-volume\") pod \"coredns-674b8bbfcf-6kv44\" (UID: \"88283619-c3e2-4b73-b5a3-401f940acb8e\") " 
pod="kube-system/coredns-674b8bbfcf-6kv44" Jan 28 00:07:16.119087 systemd[1]: Created slice kubepods-besteffort-pod0bfcf205_da0c_44de_9408_ba1ddb5cd934.slice - libcontainer container kubepods-besteffort-pod0bfcf205_da0c_44de_9408_ba1ddb5cd934.slice. Jan 28 00:07:16.125860 systemd[1]: Created slice kubepods-burstable-pod88283619_c3e2_4b73_b5a3_401f940acb8e.slice - libcontainer container kubepods-burstable-pod88283619_c3e2_4b73_b5a3_401f940acb8e.slice. Jan 28 00:07:16.133286 systemd[1]: Created slice kubepods-besteffort-pod115118f6_6ada_4444_93ef_3a99eaaacd44.slice - libcontainer container kubepods-besteffort-pod115118f6_6ada_4444_93ef_3a99eaaacd44.slice. Jan 28 00:07:16.136869 systemd[1]: Created slice kubepods-besteffort-pod6b65da67_3308_40ad_9789_eac2c3fb395b.slice - libcontainer container kubepods-besteffort-pod6b65da67_3308_40ad_9789_eac2c3fb395b.slice. Jan 28 00:07:16.400758 containerd[1660]: time="2026-01-28T00:07:16.400586760Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-564d6bbdf8-z4ccx,Uid:c30f0bb9-9acc-4a29-aba6-a8788797d9e8,Namespace:calico-apiserver,Attempt:0,}" Jan 28 00:07:16.408388 containerd[1660]: time="2026-01-28T00:07:16.408343464Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vcgk6,Uid:e8f7a78c-8d11-4e0b-a34b-f3ae93c9f74b,Namespace:kube-system,Attempt:0,}" Jan 28 00:07:16.416279 containerd[1660]: time="2026-01-28T00:07:16.416243328Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-579c57b986-v9zhd,Uid:7d3bf2f0-73b0-413a-909b-a8ef20ea0438,Namespace:calico-system,Attempt:0,}" Jan 28 00:07:16.424833 containerd[1660]: time="2026-01-28T00:07:16.424557713Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-564d6bbdf8-pmnrt,Uid:0bfcf205-da0c-44de-9408-ba1ddb5cd934,Namespace:calico-apiserver,Attempt:0,}" Jan 28 00:07:16.431333 containerd[1660]: time="2026-01-28T00:07:16.431305173Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-6kv44,Uid:88283619-c3e2-4b73-b5a3-401f940acb8e,Namespace:kube-system,Attempt:0,}" Jan 28 00:07:16.437186 containerd[1660]: time="2026-01-28T00:07:16.437149351Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-4g849,Uid:115118f6-6ada-4444-93ef-3a99eaaacd44,Namespace:calico-system,Attempt:0,}" Jan 28 00:07:16.440970 containerd[1660]: time="2026-01-28T00:07:16.440932203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f4b7bc95d-nndks,Uid:6b65da67-3308-40ad-9789-eac2c3fb395b,Namespace:calico-system,Attempt:0,}" Jan 28 00:07:16.518826 containerd[1660]: time="2026-01-28T00:07:16.518758999Z" level=error msg="Failed to destroy network for sandbox \"239ff395137a45ddb236f2bc11bd64f2ddbd06bd2b74174592c3fcc92f320fa5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.522082 containerd[1660]: time="2026-01-28T00:07:16.522023889Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vcgk6,Uid:e8f7a78c-8d11-4e0b-a34b-f3ae93c9f74b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"239ff395137a45ddb236f2bc11bd64f2ddbd06bd2b74174592c3fcc92f320fa5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Jan 28 00:07:16.522462 kubelet[2911]: E0128 00:07:16.522420 2911 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"239ff395137a45ddb236f2bc11bd64f2ddbd06bd2b74174592c3fcc92f320fa5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.522589 kubelet[2911]: E0128 00:07:16.522493 2911 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"239ff395137a45ddb236f2bc11bd64f2ddbd06bd2b74174592c3fcc92f320fa5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-vcgk6" Jan 28 00:07:16.522589 kubelet[2911]: E0128 00:07:16.522514 2911 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"239ff395137a45ddb236f2bc11bd64f2ddbd06bd2b74174592c3fcc92f320fa5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-vcgk6" Jan 28 00:07:16.522589 kubelet[2911]: E0128 00:07:16.522564 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-vcgk6_kube-system(e8f7a78c-8d11-4e0b-a34b-f3ae93c9f74b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-vcgk6_kube-system(e8f7a78c-8d11-4e0b-a34b-f3ae93c9f74b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"239ff395137a45ddb236f2bc11bd64f2ddbd06bd2b74174592c3fcc92f320fa5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-vcgk6" podUID="e8f7a78c-8d11-4e0b-a34b-f3ae93c9f74b" Jan 28 00:07:16.526111 containerd[1660]: time="2026-01-28T00:07:16.526043541Z" level=error msg="Failed to destroy network for sandbox \"6c42f99428cc6e16453f7b1633490e784dac1e7285de0f41d9d25c67a1cd1f45\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.530101 containerd[1660]: time="2026-01-28T00:07:16.529955633Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-564d6bbdf8-z4ccx,Uid:c30f0bb9-9acc-4a29-aba6-a8788797d9e8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c42f99428cc6e16453f7b1633490e784dac1e7285de0f41d9d25c67a1cd1f45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.530237 kubelet[2911]: E0128 00:07:16.530189 2911 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c42f99428cc6e16453f7b1633490e784dac1e7285de0f41d9d25c67a1cd1f45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.530283 kubelet[2911]: E0128 00:07:16.530260 2911 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c42f99428cc6e16453f7b1633490e784dac1e7285de0f41d9d25c67a1cd1f45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" Jan 28 00:07:16.530308 kubelet[2911]: E0128 00:07:16.530280 2911 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c42f99428cc6e16453f7b1633490e784dac1e7285de0f41d9d25c67a1cd1f45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" Jan 28 00:07:16.530367 kubelet[2911]: E0128 00:07:16.530328 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-564d6bbdf8-z4ccx_calico-apiserver(c30f0bb9-9acc-4a29-aba6-a8788797d9e8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-564d6bbdf8-z4ccx_calico-apiserver(c30f0bb9-9acc-4a29-aba6-a8788797d9e8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6c42f99428cc6e16453f7b1633490e784dac1e7285de0f41d9d25c67a1cd1f45\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:07:16.538164 containerd[1660]: time="2026-01-28T00:07:16.538042298Z" level=error msg="Failed to destroy network for sandbox \"c1290a51d12b5bf678438a2097d495d242f956b3a49d719b64ab405c815150c3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.542968 containerd[1660]: time="2026-01-28T00:07:16.542925352Z" level=error msg="Failed to destroy network for sandbox \"0a9a8f3b44b85b1d580e36a3a30c3d368437b598e8c32f1a82e31af18c652210\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.545272 containerd[1660]: time="2026-01-28T00:07:16.544936558Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-579c57b986-v9zhd,Uid:7d3bf2f0-73b0-413a-909b-a8ef20ea0438,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1290a51d12b5bf678438a2097d495d242f956b3a49d719b64ab405c815150c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.545397 kubelet[2911]: E0128 00:07:16.545354 2911 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"c1290a51d12b5bf678438a2097d495d242f956b3a49d719b64ab405c815150c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.545454 kubelet[2911]: E0128 00:07:16.545408 2911 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1290a51d12b5bf678438a2097d495d242f956b3a49d719b64ab405c815150c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" Jan 28 00:07:16.545454 kubelet[2911]: E0128 00:07:16.545428 2911 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1290a51d12b5bf678438a2097d495d242f956b3a49d719b64ab405c815150c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" Jan 28 00:07:16.545513 kubelet[2911]: E0128 00:07:16.545473 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-579c57b986-v9zhd_calico-system(7d3bf2f0-73b0-413a-909b-a8ef20ea0438)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-579c57b986-v9zhd_calico-system(7d3bf2f0-73b0-413a-909b-a8ef20ea0438)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c1290a51d12b5bf678438a2097d495d242f956b3a49d719b64ab405c815150c3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:07:16.547503 containerd[1660]: time="2026-01-28T00:07:16.547468846Z" level=error msg="Failed to destroy network for sandbox \"472e5072a10e427ba4c32151f8594cfbe5d81d60d0a68a147a1bb33a8edc9806\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.549865 containerd[1660]: time="2026-01-28T00:07:16.549758893Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-564d6bbdf8-pmnrt,Uid:0bfcf205-da0c-44de-9408-ba1ddb5cd934,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a9a8f3b44b85b1d580e36a3a30c3d368437b598e8c32f1a82e31af18c652210\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.550192 kubelet[2911]: E0128 00:07:16.550156 2911 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a9a8f3b44b85b1d580e36a3a30c3d368437b598e8c32f1a82e31af18c652210\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 
00:07:16.551048 kubelet[2911]: E0128 00:07:16.550255 2911 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a9a8f3b44b85b1d580e36a3a30c3d368437b598e8c32f1a82e31af18c652210\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" Jan 28 00:07:16.551048 kubelet[2911]: E0128 00:07:16.550280 2911 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a9a8f3b44b85b1d580e36a3a30c3d368437b598e8c32f1a82e31af18c652210\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" Jan 28 00:07:16.551048 kubelet[2911]: E0128 00:07:16.550330 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-564d6bbdf8-pmnrt_calico-apiserver(0bfcf205-da0c-44de-9408-ba1ddb5cd934)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-564d6bbdf8-pmnrt_calico-apiserver(0bfcf205-da0c-44de-9408-ba1ddb5cd934)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0a9a8f3b44b85b1d580e36a3a30c3d368437b598e8c32f1a82e31af18c652210\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:07:16.552321 containerd[1660]: time="2026-01-28T00:07:16.552226181Z" level=error msg="Failed to destroy network for sandbox \"8d820346f9065095298118b531ff74429048e05aec92da102e72320b4a1d92ff\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.552814 containerd[1660]: time="2026-01-28T00:07:16.552778182Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-4g849,Uid:115118f6-6ada-4444-93ef-3a99eaaacd44,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"472e5072a10e427ba4c32151f8594cfbe5d81d60d0a68a147a1bb33a8edc9806\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.553023 kubelet[2911]: E0128 00:07:16.552961 2911 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"472e5072a10e427ba4c32151f8594cfbe5d81d60d0a68a147a1bb33a8edc9806\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.553076 kubelet[2911]: E0128 00:07:16.553037 2911 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"472e5072a10e427ba4c32151f8594cfbe5d81d60d0a68a147a1bb33a8edc9806\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-4g849" Jan 28 00:07:16.553076 kubelet[2911]: E0128 00:07:16.553055 2911 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"472e5072a10e427ba4c32151f8594cfbe5d81d60d0a68a147a1bb33a8edc9806\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-4g849" Jan 28 00:07:16.553158 kubelet[2911]: E0128 00:07:16.553115 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-4g849_calico-system(115118f6-6ada-4444-93ef-3a99eaaacd44)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-4g849_calico-system(115118f6-6ada-4444-93ef-3a99eaaacd44)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"472e5072a10e427ba4c32151f8594cfbe5d81d60d0a68a147a1bb33a8edc9806\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:07:16.555289 containerd[1660]: time="2026-01-28T00:07:16.555198190Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-6kv44,Uid:88283619-c3e2-4b73-b5a3-401f940acb8e,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d820346f9065095298118b531ff74429048e05aec92da102e72320b4a1d92ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.555372 kubelet[2911]: E0128 00:07:16.555341 2911 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d820346f9065095298118b531ff74429048e05aec92da102e72320b4a1d92ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.555404 kubelet[2911]: E0128 00:07:16.555370 2911 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d820346f9065095298118b531ff74429048e05aec92da102e72320b4a1d92ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-6kv44" Jan 28 00:07:16.555404 kubelet[2911]: E0128 00:07:16.555385 2911 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d820346f9065095298118b531ff74429048e05aec92da102e72320b4a1d92ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-6kv44" Jan 28 00:07:16.555453 kubelet[2911]: E0128 
00:07:16.555415 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-6kv44_kube-system(88283619-c3e2-4b73-b5a3-401f940acb8e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-6kv44_kube-system(88283619-c3e2-4b73-b5a3-401f940acb8e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8d820346f9065095298118b531ff74429048e05aec92da102e72320b4a1d92ff\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-6kv44" podUID="88283619-c3e2-4b73-b5a3-401f940acb8e" Jan 28 00:07:16.555598 containerd[1660]: time="2026-01-28T00:07:16.555567031Z" level=error msg="Failed to destroy network for sandbox \"6862fe0e7de6e49acc4f3818ea2eae255af9dbeaa181093dd078c4fac7896f38\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.558998 containerd[1660]: time="2026-01-28T00:07:16.558966721Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f4b7bc95d-nndks,Uid:6b65da67-3308-40ad-9789-eac2c3fb395b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6862fe0e7de6e49acc4f3818ea2eae255af9dbeaa181093dd078c4fac7896f38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.559187 kubelet[2911]: E0128 00:07:16.559124 2911 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6862fe0e7de6e49acc4f3818ea2eae255af9dbeaa181093dd078c4fac7896f38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.559284 kubelet[2911]: E0128 00:07:16.559190 2911 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6862fe0e7de6e49acc4f3818ea2eae255af9dbeaa181093dd078c4fac7896f38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6f4b7bc95d-nndks" Jan 28 00:07:16.559284 kubelet[2911]: E0128 00:07:16.559216 2911 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6862fe0e7de6e49acc4f3818ea2eae255af9dbeaa181093dd078c4fac7896f38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6f4b7bc95d-nndks" Jan 28 00:07:16.559284 kubelet[2911]: E0128 00:07:16.559263 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6f4b7bc95d-nndks_calico-system(6b65da67-3308-40ad-9789-eac2c3fb395b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6f4b7bc95d-nndks_calico-system(6b65da67-3308-40ad-9789-eac2c3fb395b)\\\": rpc error: code = 
Unknown desc = failed to setup network for sandbox \\\"6862fe0e7de6e49acc4f3818ea2eae255af9dbeaa181093dd078c4fac7896f38\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6f4b7bc95d-nndks" podUID="6b65da67-3308-40ad-9789-eac2c3fb395b" Jan 28 00:07:16.934654 systemd[1]: Created slice kubepods-besteffort-pod33c364a8_976a_4a66_b401_5e5e3f5c6aca.slice - libcontainer container kubepods-besteffort-pod33c364a8_976a_4a66_b401_5e5e3f5c6aca.slice. Jan 28 00:07:16.936737 containerd[1660]: time="2026-01-28T00:07:16.936706828Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gql7d,Uid:33c364a8-976a-4a66-b401-5e5e3f5c6aca,Namespace:calico-system,Attempt:0,}" Jan 28 00:07:16.980257 containerd[1660]: time="2026-01-28T00:07:16.980187480Z" level=error msg="Failed to destroy network for sandbox \"c2ad7b953615cb4125bc88db361e4b05171533427ae357b86e42c21ca921c9fd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.984192 containerd[1660]: time="2026-01-28T00:07:16.984151493Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gql7d,Uid:33c364a8-976a-4a66-b401-5e5e3f5c6aca,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2ad7b953615cb4125bc88db361e4b05171533427ae357b86e42c21ca921c9fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.984411 kubelet[2911]: E0128 00:07:16.984378 2911 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2ad7b953615cb4125bc88db361e4b05171533427ae357b86e42c21ca921c9fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 28 00:07:16.984474 kubelet[2911]: E0128 00:07:16.984439 2911 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2ad7b953615cb4125bc88db361e4b05171533427ae357b86e42c21ca921c9fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gql7d" Jan 28 00:07:16.984474 kubelet[2911]: E0128 00:07:16.984458 2911 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2ad7b953615cb4125bc88db361e4b05171533427ae357b86e42c21ca921c9fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gql7d" Jan 28 00:07:16.984531 kubelet[2911]: E0128 00:07:16.984503 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gql7d_calico-system(33c364a8-976a-4a66-b401-5e5e3f5c6aca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"csi-node-driver-gql7d_calico-system(33c364a8-976a-4a66-b401-5e5e3f5c6aca)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c2ad7b953615cb4125bc88db361e4b05171533427ae357b86e42c21ca921c9fd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:07:17.013516 containerd[1660]: time="2026-01-28T00:07:17.013455462Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 28 00:07:17.406304 systemd[1]: run-netns-cni\x2da137a731\x2d3719\x2da94c\x2dcd89\x2d760d16c5dfdd.mount: Deactivated successfully. Jan 28 00:07:17.406391 systemd[1]: run-netns-cni\x2dccc61cc5\x2d5ec7\x2d05c9\x2d580a\x2d99a2a36b966c.mount: Deactivated successfully. Jan 28 00:07:17.406438 systemd[1]: run-netns-cni\x2d784e3c1c\x2d9d6e\x2d6ddc\x2de3ab\x2d17efef84001e.mount: Deactivated successfully. Jan 28 00:07:17.406480 systemd[1]: run-netns-cni\x2db75f7507\x2d049c\x2da573\x2df44f\x2d32fa06e8a60b.mount: Deactivated successfully. Jan 28 00:07:17.406521 systemd[1]: run-netns-cni\x2dd628cebe\x2d2b99\x2d8e36\x2dd801\x2df2276b7137f5.mount: Deactivated successfully. Jan 28 00:07:17.846952 kubelet[2911]: I0128 00:07:17.846502 2911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 00:07:17.873270 kernel: kauditd_printk_skb: 22 callbacks suppressed Jan 28 00:07:17.873589 kernel: audit: type=1325 audit(1769558837.869:572): table=filter:117 family=2 entries=21 op=nft_register_rule pid=3969 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:17.869000 audit[3969]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=3969 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:17.869000 audit[3969]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffffa7dc1f0 a2=0 a3=1 items=0 ppid=3024 pid=3969 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:17.877329 kernel: audit: type=1300 audit(1769558837.869:572): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffffa7dc1f0 a2=0 a3=1 items=0 ppid=3024 pid=3969 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:17.869000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:17.879167 kernel: audit: type=1327 audit(1769558837.869:572): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:17.878000 audit[3969]: NETFILTER_CFG table=nat:118 family=2 entries=19 op=nft_register_chain pid=3969 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:17.881988 kernel: audit: type=1325 audit(1769558837.878:573): table=nat:118 family=2 entries=19 op=nft_register_chain pid=3969 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:17.878000 audit[3969]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=fffffa7dc1f0 a2=0 a3=1 items=0 ppid=3024 pid=3969 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:17.885613 kernel: audit: type=1300 audit(1769558837.878:573): arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=fffffa7dc1f0 a2=0 a3=1 items=0 ppid=3024 pid=3969 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:17.885719 kernel: audit: type=1327 audit(1769558837.878:573): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:17.878000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:22.856589 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3340954235.mount: Deactivated successfully. Jan 28 00:07:22.874839 containerd[1660]: time="2026-01-28T00:07:22.874789264Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:07:22.875804 containerd[1660]: time="2026-01-28T00:07:22.875650627Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Jan 28 00:07:22.878508 containerd[1660]: time="2026-01-28T00:07:22.878435115Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:07:22.880241 containerd[1660]: time="2026-01-28T00:07:22.880150881Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 28 00:07:22.880946 containerd[1660]: time="2026-01-28T00:07:22.880834803Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 5.867319501s" Jan 28 00:07:22.880946 containerd[1660]: time="2026-01-28T00:07:22.880863683Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Jan 28 00:07:22.896330 containerd[1660]: time="2026-01-28T00:07:22.896298890Z" level=info msg="CreateContainer within sandbox \"fc73296dd4685b7aa1ce9424a63f9253f963b68db7528ded576520528a139ad6\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 28 00:07:22.907231 containerd[1660]: time="2026-01-28T00:07:22.906363440Z" level=info msg="Container b3d83a4ec899cff07d7550b5a2721e876a9bff425522071dd48fb877be3d9235: CDI devices from CRI Config.CDIDevices: []" Jan 28 00:07:22.915862 containerd[1660]: time="2026-01-28T00:07:22.915815629Z" level=info msg="CreateContainer within sandbox \"fc73296dd4685b7aa1ce9424a63f9253f963b68db7528ded576520528a139ad6\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"b3d83a4ec899cff07d7550b5a2721e876a9bff425522071dd48fb877be3d9235\"" Jan 28 00:07:22.916719 containerd[1660]: time="2026-01-28T00:07:22.916687991Z" level=info 
msg="StartContainer for \"b3d83a4ec899cff07d7550b5a2721e876a9bff425522071dd48fb877be3d9235\"" Jan 28 00:07:22.918835 containerd[1660]: time="2026-01-28T00:07:22.918806878Z" level=info msg="connecting to shim b3d83a4ec899cff07d7550b5a2721e876a9bff425522071dd48fb877be3d9235" address="unix:///run/containerd/s/f9638a4259712c3dc8a637119bb69f76cd795252fb61365401e9e8607819e033" protocol=ttrpc version=3 Jan 28 00:07:22.938405 systemd[1]: Started cri-containerd-b3d83a4ec899cff07d7550b5a2721e876a9bff425522071dd48fb877be3d9235.scope - libcontainer container b3d83a4ec899cff07d7550b5a2721e876a9bff425522071dd48fb877be3d9235. Jan 28 00:07:22.979000 audit: BPF prog-id=172 op=LOAD Jan 28 00:07:22.979000 audit[3978]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3407 pid=3978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:22.985681 kernel: audit: type=1334 audit(1769558842.979:574): prog-id=172 op=LOAD Jan 28 00:07:22.985731 kernel: audit: type=1300 audit(1769558842.979:574): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3407 pid=3978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:22.985755 kernel: audit: type=1327 audit(1769558842.979:574): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233643833613465633839396366663037643735353062356132373231 Jan 28 00:07:22.979000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233643833613465633839396366663037643735353062356132373231 Jan 28 00:07:22.979000 audit: BPF prog-id=173 op=LOAD Jan 28 00:07:22.990409 kernel: audit: type=1334 audit(1769558842.979:575): prog-id=173 op=LOAD Jan 28 00:07:22.979000 audit[3978]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3407 pid=3978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:22.994596 kernel: audit: type=1300 audit(1769558842.979:575): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3407 pid=3978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:22.994700 kernel: audit: type=1327 audit(1769558842.979:575): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233643833613465633839396366663037643735353062356132373231 Jan 28 00:07:22.979000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233643833613465633839396366663037643735353062356132373231 Jan 28 00:07:22.984000 audit: BPF prog-id=173 op=UNLOAD Jan 28 00:07:22.999452 kernel: audit: type=1334 audit(1769558842.984:576): prog-id=173 op=UNLOAD Jan 28 00:07:22.984000 audit[3978]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3407 pid=3978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:23.003415 kernel: audit: type=1300 audit(1769558842.984:576): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3407 pid=3978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:23.003637 kernel: audit: type=1327 audit(1769558842.984:576): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233643833613465633839396366663037643735353062356132373231 Jan 28 00:07:22.984000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233643833613465633839396366663037643735353062356132373231 Jan 28 00:07:22.984000 audit: BPF prog-id=172 op=UNLOAD Jan 28 00:07:23.008390 kernel: audit: type=1334 audit(1769558842.984:577): prog-id=172 op=UNLOAD Jan 28 00:07:22.984000 audit[3978]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3407 pid=3978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:22.984000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233643833613465633839396366663037643735353062356132373231 Jan 28 00:07:22.984000 audit: BPF prog-id=174 op=LOAD Jan 28 00:07:22.984000 audit[3978]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3407 pid=3978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:22.984000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233643833613465633839396366663037643735353062356132373231 Jan 28 00:07:23.030390 containerd[1660]: time="2026-01-28T00:07:23.030083176Z" level=info msg="StartContainer for \"b3d83a4ec899cff07d7550b5a2721e876a9bff425522071dd48fb877be3d9235\" returns successfully" Jan 28 00:07:23.162396 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. 
Jan 28 00:07:23.162575 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 28 00:07:23.364234 kubelet[2911]: I0128 00:07:23.363291 2911 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlrrv\" (UniqueName: \"kubernetes.io/projected/6b65da67-3308-40ad-9789-eac2c3fb395b-kube-api-access-mlrrv\") pod \"6b65da67-3308-40ad-9789-eac2c3fb395b\" (UID: \"6b65da67-3308-40ad-9789-eac2c3fb395b\") " Jan 28 00:07:23.364234 kubelet[2911]: I0128 00:07:23.363586 2911 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6b65da67-3308-40ad-9789-eac2c3fb395b-whisker-backend-key-pair\") pod \"6b65da67-3308-40ad-9789-eac2c3fb395b\" (UID: \"6b65da67-3308-40ad-9789-eac2c3fb395b\") " Jan 28 00:07:23.364234 kubelet[2911]: I0128 00:07:23.363641 2911 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b65da67-3308-40ad-9789-eac2c3fb395b-whisker-ca-bundle\") pod \"6b65da67-3308-40ad-9789-eac2c3fb395b\" (UID: \"6b65da67-3308-40ad-9789-eac2c3fb395b\") " Jan 28 00:07:23.366152 kubelet[2911]: I0128 00:07:23.365146 2911 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b65da67-3308-40ad-9789-eac2c3fb395b-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "6b65da67-3308-40ad-9789-eac2c3fb395b" (UID: "6b65da67-3308-40ad-9789-eac2c3fb395b"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 28 00:07:23.366735 kubelet[2911]: I0128 00:07:23.366697 2911 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b65da67-3308-40ad-9789-eac2c3fb395b-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "6b65da67-3308-40ad-9789-eac2c3fb395b" (UID: "6b65da67-3308-40ad-9789-eac2c3fb395b"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 28 00:07:23.367101 kubelet[2911]: I0128 00:07:23.367062 2911 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b65da67-3308-40ad-9789-eac2c3fb395b-kube-api-access-mlrrv" (OuterVolumeSpecName: "kube-api-access-mlrrv") pod "6b65da67-3308-40ad-9789-eac2c3fb395b" (UID: "6b65da67-3308-40ad-9789-eac2c3fb395b"). InnerVolumeSpecName "kube-api-access-mlrrv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 28 00:07:23.467563 kubelet[2911]: I0128 00:07:23.467417 2911 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mlrrv\" (UniqueName: \"kubernetes.io/projected/6b65da67-3308-40ad-9789-eac2c3fb395b-kube-api-access-mlrrv\") on node \"ci-4593-0-0-n-ea467cc685\" DevicePath \"\"" Jan 28 00:07:23.467563 kubelet[2911]: I0128 00:07:23.467486 2911 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6b65da67-3308-40ad-9789-eac2c3fb395b-whisker-backend-key-pair\") on node \"ci-4593-0-0-n-ea467cc685\" DevicePath \"\"" Jan 28 00:07:23.467563 kubelet[2911]: I0128 00:07:23.467516 2911 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b65da67-3308-40ad-9789-eac2c3fb395b-whisker-ca-bundle\") on node \"ci-4593-0-0-n-ea467cc685\" DevicePath \"\"" Jan 28 00:07:23.855843 systemd[1]: var-lib-kubelet-pods-6b65da67\x2d3308\x2d40ad\x2d9789\x2deac2c3fb395b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dmlrrv.mount: Deactivated successfully. Jan 28 00:07:23.855926 systemd[1]: var-lib-kubelet-pods-6b65da67\x2d3308\x2d40ad\x2d9789\x2deac2c3fb395b-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 28 00:07:24.036339 systemd[1]: Removed slice kubepods-besteffort-pod6b65da67_3308_40ad_9789_eac2c3fb395b.slice - libcontainer container kubepods-besteffort-pod6b65da67_3308_40ad_9789_eac2c3fb395b.slice. Jan 28 00:07:24.053442 kubelet[2911]: I0128 00:07:24.053382 2911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-zkxqr" podStartSLOduration=1.761218157 podStartE2EDuration="17.053362764s" podCreationTimestamp="2026-01-28 00:07:07 +0000 UTC" firstStartedPulling="2026-01-28 00:07:07.589557078 +0000 UTC m=+22.748733335" lastFinishedPulling="2026-01-28 00:07:22.881701685 +0000 UTC m=+38.040877942" observedRunningTime="2026-01-28 00:07:24.053339284 +0000 UTC m=+39.212515541" watchObservedRunningTime="2026-01-28 00:07:24.053362764 +0000 UTC m=+39.212539021" Jan 28 00:07:24.111254 systemd[1]: Created slice kubepods-besteffort-pod944e37ab_4f3a_457c_b85f_87dff3debf4c.slice - libcontainer container kubepods-besteffort-pod944e37ab_4f3a_457c_b85f_87dff3debf4c.slice. 
Jan 28 00:07:24.171360 kubelet[2911]: I0128 00:07:24.171274 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/944e37ab-4f3a-457c-b85f-87dff3debf4c-whisker-ca-bundle\") pod \"whisker-7d74b86764-tnxb5\" (UID: \"944e37ab-4f3a-457c-b85f-87dff3debf4c\") " pod="calico-system/whisker-7d74b86764-tnxb5" Jan 28 00:07:24.171360 kubelet[2911]: I0128 00:07:24.171321 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/944e37ab-4f3a-457c-b85f-87dff3debf4c-whisker-backend-key-pair\") pod \"whisker-7d74b86764-tnxb5\" (UID: \"944e37ab-4f3a-457c-b85f-87dff3debf4c\") " pod="calico-system/whisker-7d74b86764-tnxb5" Jan 28 00:07:24.171636 kubelet[2911]: I0128 00:07:24.171402 2911 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-468nq\" (UniqueName: \"kubernetes.io/projected/944e37ab-4f3a-457c-b85f-87dff3debf4c-kube-api-access-468nq\") pod \"whisker-7d74b86764-tnxb5\" (UID: \"944e37ab-4f3a-457c-b85f-87dff3debf4c\") " pod="calico-system/whisker-7d74b86764-tnxb5" Jan 28 00:07:24.419801 containerd[1660]: time="2026-01-28T00:07:24.419701037Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7d74b86764-tnxb5,Uid:944e37ab-4f3a-457c-b85f-87dff3debf4c,Namespace:calico-system,Attempt:0,}" Jan 28 00:07:24.596576 systemd-networkd[1576]: cali474b0448161: Link UP Jan 28 00:07:24.597310 systemd-networkd[1576]: cali474b0448161: Gained carrier Jan 28 00:07:24.618902 containerd[1660]: 2026-01-28 00:07:24.462 [INFO][4081] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 28 00:07:24.618902 containerd[1660]: 2026-01-28 00:07:24.485 [INFO][4081] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--ea467cc685-k8s-whisker--7d74b86764--tnxb5-eth0 whisker-7d74b86764- calico-system 944e37ab-4f3a-457c-b85f-87dff3debf4c 920 0 2026-01-28 00:07:24 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7d74b86764 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4593-0-0-n-ea467cc685 whisker-7d74b86764-tnxb5 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali474b0448161 [] [] }} ContainerID="9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" Namespace="calico-system" Pod="whisker-7d74b86764-tnxb5" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-whisker--7d74b86764--tnxb5-" Jan 28 00:07:24.618902 containerd[1660]: 2026-01-28 00:07:24.485 [INFO][4081] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" Namespace="calico-system" Pod="whisker-7d74b86764-tnxb5" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-whisker--7d74b86764--tnxb5-eth0" Jan 28 00:07:24.618902 containerd[1660]: 2026-01-28 00:07:24.543 [INFO][4153] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" HandleID="k8s-pod-network.9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" Workload="ci--4593--0--0--n--ea467cc685-k8s-whisker--7d74b86764--tnxb5-eth0" Jan 28 00:07:24.619128 containerd[1660]: 2026-01-28 00:07:24.544 [INFO][4153] 
ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" HandleID="k8s-pod-network.9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" Workload="ci--4593--0--0--n--ea467cc685-k8s-whisker--7d74b86764--tnxb5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004dba0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-n-ea467cc685", "pod":"whisker-7d74b86764-tnxb5", "timestamp":"2026-01-28 00:07:24.543940734 +0000 UTC"}, Hostname:"ci-4593-0-0-n-ea467cc685", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 00:07:24.619128 containerd[1660]: 2026-01-28 00:07:24.544 [INFO][4153] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 00:07:24.619128 containerd[1660]: 2026-01-28 00:07:24.544 [INFO][4153] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 28 00:07:24.619128 containerd[1660]: 2026-01-28 00:07:24.544 [INFO][4153] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-ea467cc685' Jan 28 00:07:24.619128 containerd[1660]: 2026-01-28 00:07:24.554 [INFO][4153] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:24.619128 containerd[1660]: 2026-01-28 00:07:24.560 [INFO][4153] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:24.619128 containerd[1660]: 2026-01-28 00:07:24.565 [INFO][4153] ipam/ipam.go 511: Trying affinity for 192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:24.619128 containerd[1660]: 2026-01-28 00:07:24.567 [INFO][4153] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:24.619128 containerd[1660]: 2026-01-28 00:07:24.569 [INFO][4153] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:24.619319 containerd[1660]: 2026-01-28 00:07:24.569 [INFO][4153] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.0/26 handle="k8s-pod-network.9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:24.619319 containerd[1660]: 2026-01-28 00:07:24.571 [INFO][4153] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d Jan 28 00:07:24.619319 containerd[1660]: 2026-01-28 00:07:24.575 [INFO][4153] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.0/26 handle="k8s-pod-network.9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:24.619319 containerd[1660]: 2026-01-28 00:07:24.583 [INFO][4153] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.1/26] block=192.168.115.0/26 handle="k8s-pod-network.9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:24.619319 containerd[1660]: 2026-01-28 00:07:24.583 [INFO][4153] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.1/26] handle="k8s-pod-network.9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:24.619319 containerd[1660]: 
2026-01-28 00:07:24.583 [INFO][4153] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 28 00:07:24.619319 containerd[1660]: 2026-01-28 00:07:24.583 [INFO][4153] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.1/26] IPv6=[] ContainerID="9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" HandleID="k8s-pod-network.9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" Workload="ci--4593--0--0--n--ea467cc685-k8s-whisker--7d74b86764--tnxb5-eth0" Jan 28 00:07:24.619542 containerd[1660]: 2026-01-28 00:07:24.587 [INFO][4081] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" Namespace="calico-system" Pod="whisker-7d74b86764-tnxb5" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-whisker--7d74b86764--tnxb5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-whisker--7d74b86764--tnxb5-eth0", GenerateName:"whisker-7d74b86764-", Namespace:"calico-system", SelfLink:"", UID:"944e37ab-4f3a-457c-b85f-87dff3debf4c", ResourceVersion:"920", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 7, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7d74b86764", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"", Pod:"whisker-7d74b86764-tnxb5", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.115.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali474b0448161", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:24.619542 containerd[1660]: 2026-01-28 00:07:24.587 [INFO][4081] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.1/32] ContainerID="9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" Namespace="calico-system" Pod="whisker-7d74b86764-tnxb5" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-whisker--7d74b86764--tnxb5-eth0" Jan 28 00:07:24.619690 containerd[1660]: 2026-01-28 00:07:24.587 [INFO][4081] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali474b0448161 ContainerID="9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" Namespace="calico-system" Pod="whisker-7d74b86764-tnxb5" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-whisker--7d74b86764--tnxb5-eth0" Jan 28 00:07:24.619690 containerd[1660]: 2026-01-28 00:07:24.597 [INFO][4081] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" Namespace="calico-system" Pod="whisker-7d74b86764-tnxb5" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-whisker--7d74b86764--tnxb5-eth0" Jan 28 00:07:24.619758 containerd[1660]: 2026-01-28 00:07:24.597 [INFO][4081] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" Namespace="calico-system" Pod="whisker-7d74b86764-tnxb5" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-whisker--7d74b86764--tnxb5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-whisker--7d74b86764--tnxb5-eth0", GenerateName:"whisker-7d74b86764-", Namespace:"calico-system", SelfLink:"", UID:"944e37ab-4f3a-457c-b85f-87dff3debf4c", ResourceVersion:"920", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 7, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7d74b86764", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d", Pod:"whisker-7d74b86764-tnxb5", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.115.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali474b0448161", MAC:"4e:6e:f9:56:20:78", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:24.619825 containerd[1660]: 2026-01-28 00:07:24.615 [INFO][4081] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" Namespace="calico-system" Pod="whisker-7d74b86764-tnxb5" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-whisker--7d74b86764--tnxb5-eth0" Jan 28 00:07:24.646235 containerd[1660]: time="2026-01-28T00:07:24.646067044Z" level=info msg="connecting to shim 9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d" address="unix:///run/containerd/s/91aff92c74c85d031076278629ac3e3d6255a015b3dda2e91267d629b41ebb03" namespace=k8s.io protocol=ttrpc version=3 Jan 28 00:07:24.646000 audit: BPF prog-id=175 op=LOAD Jan 28 00:07:24.646000 audit[4217]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffff2bdbb8 a2=98 a3=ffffff2bdba8 items=0 ppid=4075 pid=4217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.646000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 00:07:24.647000 audit: BPF prog-id=175 op=UNLOAD Jan 28 00:07:24.647000 audit[4217]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffff2bdb88 a3=0 items=0 ppid=4075 pid=4217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.647000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 00:07:24.647000 audit: BPF prog-id=176 op=LOAD Jan 28 00:07:24.647000 audit[4217]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffff2bda68 a2=74 a3=95 items=0 ppid=4075 pid=4217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.647000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 00:07:24.647000 audit: BPF prog-id=176 op=UNLOAD Jan 28 00:07:24.647000 audit[4217]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4075 pid=4217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.647000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 00:07:24.647000 audit: BPF prog-id=177 op=LOAD Jan 28 00:07:24.647000 audit[4217]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffff2bda98 a2=40 a3=ffffff2bdac8 items=0 ppid=4075 pid=4217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.647000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 00:07:24.647000 audit: BPF prog-id=177 op=UNLOAD Jan 28 00:07:24.647000 audit[4217]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffff2bdac8 items=0 ppid=4075 pid=4217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.647000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 28 00:07:24.648000 audit: BPF prog-id=178 op=LOAD Jan 28 00:07:24.648000 audit[4222]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd0442e08 a2=98 a3=ffffd0442df8 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.648000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.649000 audit: BPF 
prog-id=178 op=UNLOAD Jan 28 00:07:24.649000 audit[4222]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd0442dd8 a3=0 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.649000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.649000 audit: BPF prog-id=179 op=LOAD Jan 28 00:07:24.649000 audit[4222]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd0442a98 a2=74 a3=95 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.649000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.649000 audit: BPF prog-id=179 op=UNLOAD Jan 28 00:07:24.649000 audit[4222]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.649000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.649000 audit: BPF prog-id=180 op=LOAD Jan 28 00:07:24.649000 audit[4222]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd0442af8 a2=94 a3=2 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.649000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.649000 audit: BPF prog-id=180 op=UNLOAD Jan 28 00:07:24.649000 audit[4222]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.649000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.686647 systemd[1]: Started cri-containerd-9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d.scope - libcontainer container 9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d. 
Jan 28 00:07:24.696000 audit: BPF prog-id=181 op=LOAD Jan 28 00:07:24.696000 audit: BPF prog-id=182 op=LOAD Jan 28 00:07:24.696000 audit[4229]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4214 pid=4229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.696000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962616430643366613932666533313965306236326336636137663534 Jan 28 00:07:24.697000 audit: BPF prog-id=182 op=UNLOAD Jan 28 00:07:24.697000 audit[4229]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4214 pid=4229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.697000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962616430643366613932666533313965306236326336636137663534 Jan 28 00:07:24.697000 audit: BPF prog-id=183 op=LOAD Jan 28 00:07:24.697000 audit[4229]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4214 pid=4229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.697000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962616430643366613932666533313965306236326336636137663534 Jan 28 00:07:24.697000 audit: BPF prog-id=184 op=LOAD Jan 28 00:07:24.697000 audit[4229]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4214 pid=4229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.697000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962616430643366613932666533313965306236326336636137663534 Jan 28 00:07:24.697000 audit: BPF prog-id=184 op=UNLOAD Jan 28 00:07:24.697000 audit[4229]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4214 pid=4229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.697000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962616430643366613932666533313965306236326336636137663534 Jan 28 00:07:24.698000 audit: BPF prog-id=183 op=UNLOAD Jan 28 00:07:24.698000 audit[4229]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4214 pid=4229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.698000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962616430643366613932666533313965306236326336636137663534 Jan 28 00:07:24.698000 audit: BPF prog-id=185 op=LOAD Jan 28 00:07:24.698000 audit[4229]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4214 pid=4229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.698000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962616430643366613932666533313965306236326336636137663534 Jan 28 00:07:24.721047 containerd[1660]: time="2026-01-28T00:07:24.721008232Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7d74b86764-tnxb5,Uid:944e37ab-4f3a-457c-b85f-87dff3debf4c,Namespace:calico-system,Attempt:0,} returns sandbox id \"9bad0d3fa92fe319e0b62c6ca7f542b29d0ee5abb3a62e7c7c68a300bf4fda4d\"" Jan 28 00:07:24.722475 containerd[1660]: time="2026-01-28T00:07:24.722432916Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 00:07:24.760000 audit: BPF prog-id=186 op=LOAD Jan 28 00:07:24.760000 audit[4222]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd0442ab8 a2=40 a3=ffffd0442ae8 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.760000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.760000 audit: BPF prog-id=186 op=UNLOAD Jan 28 00:07:24.760000 audit[4222]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffd0442ae8 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.760000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.770000 audit: BPF prog-id=187 op=LOAD Jan 28 00:07:24.770000 audit[4222]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffd0442ac8 a2=94 a3=4 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.770000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.770000 audit: BPF prog-id=187 op=UNLOAD Jan 28 00:07:24.770000 audit[4222]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.770000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.770000 audit: BPF prog-id=188 op=LOAD Jan 28 00:07:24.770000 audit[4222]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd0442908 a2=94 a3=5 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.770000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.770000 audit: BPF prog-id=188 op=UNLOAD Jan 28 00:07:24.770000 audit[4222]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.770000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.770000 audit: BPF prog-id=189 op=LOAD Jan 28 00:07:24.770000 audit[4222]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffd0442b38 a2=94 a3=6 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.770000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.770000 audit: BPF prog-id=189 op=UNLOAD Jan 28 00:07:24.770000 audit[4222]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.770000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.771000 audit: BPF prog-id=190 op=LOAD Jan 28 00:07:24.771000 audit[4222]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffd0442308 a2=94 a3=83 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.771000 audit: BPF prog-id=191 op=LOAD Jan 28 00:07:24.771000 audit[4222]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffd04420c8 a2=94 a3=2 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.771000 audit: BPF prog-id=191 op=UNLOAD Jan 28 00:07:24.771000 audit[4222]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.771000 audit: BPF prog-id=190 op=UNLOAD Jan 
28 00:07:24.771000 audit[4222]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=36879620 a3=3686cb00 items=0 ppid=4075 pid=4222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 28 00:07:24.781000 audit: BPF prog-id=192 op=LOAD Jan 28 00:07:24.781000 audit[4257]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff4d07308 a2=98 a3=fffff4d072f8 items=0 ppid=4075 pid=4257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.781000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 00:07:24.781000 audit: BPF prog-id=192 op=UNLOAD Jan 28 00:07:24.781000 audit[4257]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffff4d072d8 a3=0 items=0 ppid=4075 pid=4257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.781000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 00:07:24.781000 audit: BPF prog-id=193 op=LOAD Jan 28 00:07:24.781000 audit[4257]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff4d071b8 a2=74 a3=95 items=0 ppid=4075 pid=4257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.781000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 00:07:24.781000 audit: BPF prog-id=193 op=UNLOAD Jan 28 00:07:24.781000 audit[4257]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4075 pid=4257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.781000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 00:07:24.781000 audit: BPF prog-id=194 op=LOAD Jan 28 00:07:24.781000 audit[4257]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff4d071e8 a2=40 a3=fffff4d07218 items=0 ppid=4075 pid=4257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.781000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 00:07:24.781000 audit: BPF prog-id=194 op=UNLOAD Jan 28 00:07:24.781000 audit[4257]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=fffff4d07218 items=0 ppid=4075 pid=4257 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.781000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 28 00:07:24.837015 systemd-networkd[1576]: vxlan.calico: Link UP Jan 28 00:07:24.837021 systemd-networkd[1576]: vxlan.calico: Gained carrier Jan 28 00:07:24.854000 audit: BPF prog-id=195 op=LOAD Jan 28 00:07:24.854000 audit[4281]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd18191a8 a2=98 a3=ffffd1819198 items=0 ppid=4075 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.854000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 00:07:24.854000 audit: BPF prog-id=195 op=UNLOAD Jan 28 00:07:24.854000 audit[4281]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd1819178 a3=0 items=0 ppid=4075 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.854000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 00:07:24.854000 audit: BPF prog-id=196 op=LOAD Jan 28 00:07:24.854000 audit[4281]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd1818e88 a2=74 a3=95 items=0 ppid=4075 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.854000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 00:07:24.854000 audit: BPF prog-id=196 op=UNLOAD Jan 28 00:07:24.854000 audit[4281]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4075 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.854000 audit: 
PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 00:07:24.854000 audit: BPF prog-id=197 op=LOAD Jan 28 00:07:24.854000 audit[4281]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd1818ee8 a2=94 a3=2 items=0 ppid=4075 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.854000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 00:07:24.854000 audit: BPF prog-id=197 op=UNLOAD Jan 28 00:07:24.854000 audit[4281]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4075 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.854000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 00:07:24.854000 audit: BPF prog-id=198 op=LOAD Jan 28 00:07:24.854000 audit[4281]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd1818d68 a2=40 a3=ffffd1818d98 items=0 ppid=4075 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.854000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 00:07:24.854000 audit: BPF prog-id=198 op=UNLOAD Jan 28 00:07:24.854000 audit[4281]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=ffffd1818d98 items=0 ppid=4075 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.854000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 00:07:24.854000 audit: BPF prog-id=199 op=LOAD Jan 28 00:07:24.854000 audit[4281]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd1818eb8 a2=94 a3=b7 items=0 ppid=4075 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.854000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 00:07:24.854000 audit: BPF prog-id=199 op=UNLOAD Jan 28 00:07:24.854000 audit[4281]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4075 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.854000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 00:07:24.856000 audit: BPF prog-id=200 op=LOAD Jan 28 00:07:24.856000 audit[4281]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd1818568 a2=94 a3=2 items=0 ppid=4075 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.856000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 00:07:24.856000 audit: BPF prog-id=200 op=UNLOAD Jan 28 00:07:24.856000 audit[4281]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4075 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.856000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 00:07:24.856000 audit: BPF prog-id=201 op=LOAD Jan 28 00:07:24.856000 audit[4281]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd18186f8 a2=94 a3=30 items=0 ppid=4075 pid=4281 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.856000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 28 00:07:24.859000 audit: BPF prog-id=202 op=LOAD Jan 28 00:07:24.859000 audit[4285]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd7c299e8 a2=98 a3=ffffd7c299d8 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.859000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.859000 audit: BPF prog-id=202 op=UNLOAD Jan 28 00:07:24.859000 audit[4285]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd7c299b8 a3=0 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.859000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.859000 audit: BPF prog-id=203 op=LOAD Jan 28 00:07:24.859000 audit[4285]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd7c29678 a2=74 a3=95 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.859000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.859000 audit: BPF prog-id=203 op=UNLOAD Jan 28 00:07:24.859000 audit[4285]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.859000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.859000 audit: BPF prog-id=204 op=LOAD Jan 28 00:07:24.859000 audit[4285]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd7c296d8 a2=94 a3=2 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.859000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.859000 audit: BPF prog-id=204 op=UNLOAD Jan 28 00:07:24.859000 audit[4285]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.859000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.930589 kubelet[2911]: I0128 00:07:24.930529 2911 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b65da67-3308-40ad-9789-eac2c3fb395b" path="/var/lib/kubelet/pods/6b65da67-3308-40ad-9789-eac2c3fb395b/volumes" Jan 28 00:07:24.960000 audit: BPF prog-id=205 op=LOAD Jan 28 00:07:24.960000 audit[4285]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd7c29698 a2=40 a3=ffffd7c296c8 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.960000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.960000 audit: BPF prog-id=205 
op=UNLOAD Jan 28 00:07:24.960000 audit[4285]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffd7c296c8 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.960000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.969000 audit: BPF prog-id=206 op=LOAD Jan 28 00:07:24.969000 audit[4285]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffd7c296a8 a2=94 a3=4 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.969000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.969000 audit: BPF prog-id=206 op=UNLOAD Jan 28 00:07:24.969000 audit[4285]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.969000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.969000 audit: BPF prog-id=207 op=LOAD Jan 28 00:07:24.969000 audit[4285]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd7c294e8 a2=94 a3=5 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.969000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.969000 audit: BPF prog-id=207 op=UNLOAD Jan 28 00:07:24.969000 audit[4285]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.969000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.969000 audit: BPF prog-id=208 op=LOAD Jan 28 00:07:24.969000 audit[4285]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffd7c29718 a2=94 a3=6 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.969000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.969000 audit: BPF prog-id=208 op=UNLOAD Jan 28 00:07:24.969000 audit[4285]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.969000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.970000 audit: BPF prog-id=209 op=LOAD Jan 28 00:07:24.970000 audit[4285]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffd7c28ee8 a2=94 a3=83 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.970000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.970000 audit: BPF prog-id=210 op=LOAD Jan 28 00:07:24.970000 audit[4285]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffd7c28ca8 a2=94 a3=2 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.970000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.970000 audit: BPF prog-id=210 op=UNLOAD Jan 28 00:07:24.970000 audit[4285]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.970000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.970000 audit: BPF prog-id=209 op=UNLOAD Jan 28 00:07:24.970000 audit[4285]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=7706620 a3=76f9b00 items=0 ppid=4075 pid=4285 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.970000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 28 00:07:24.988000 audit: BPF prog-id=201 op=UNLOAD Jan 28 00:07:24.988000 audit[4075]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=400078a300 a2=0 a3=0 items=0 ppid=4050 pid=4075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:24.988000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 28 00:07:25.031000 audit[4313]: NETFILTER_CFG table=mangle:119 family=2 entries=16 op=nft_register_chain pid=4313 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 00:07:25.031000 audit[4313]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=fffff1afd110 a2=0 a3=ffff8b403fa8 items=0 ppid=4075 pid=4313 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:25.031000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 00:07:25.032000 audit[4314]: NETFILTER_CFG table=nat:120 family=2 entries=15 op=nft_register_chain pid=4314 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 00:07:25.032000 audit[4314]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=ffffe49dbbf0 a2=0 a3=ffff95042fa8 items=0 ppid=4075 pid=4314 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:25.032000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 00:07:25.040000 audit[4312]: NETFILTER_CFG table=raw:121 family=2 entries=21 op=nft_register_chain pid=4312 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 00:07:25.040000 audit[4312]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=ffffde992c30 a2=0 a3=ffffaa53efa8 items=0 ppid=4075 pid=4312 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:25.040000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 00:07:25.062400 containerd[1660]: time="2026-01-28T00:07:25.062365469Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:25.063806 containerd[1660]: time="2026-01-28T00:07:25.063776793Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 00:07:25.063851 containerd[1660]: time="2026-01-28T00:07:25.063815673Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:25.064047 kubelet[2911]: E0128 00:07:25.064008 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 00:07:25.064111 kubelet[2911]: E0128 00:07:25.064057 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 00:07:25.064251 kubelet[2911]: E0128 00:07:25.064191 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:317bcf13c30d4d6c9248c8e05fdeda91,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-468nq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d74b86764-tnxb5_calico-system(944e37ab-4f3a-457c-b85f-87dff3debf4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:25.066537 containerd[1660]: time="2026-01-28T00:07:25.066513081Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 00:07:25.047000 audit[4316]: NETFILTER_CFG table=filter:122 family=2 entries=94 op=nft_register_chain pid=4316 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 00:07:25.047000 audit[4316]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=53116 a0=3 a1=ffffef019130 a2=0 a3=ffff93fa9fa8 items=0 ppid=4075 pid=4316 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:25.047000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 00:07:25.415249 containerd[1660]: time="2026-01-28T00:07:25.415118300Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:25.416739 containerd[1660]: time="2026-01-28T00:07:25.416690145Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 
00:07:25.416804 containerd[1660]: time="2026-01-28T00:07:25.416771825Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:25.416977 kubelet[2911]: E0128 00:07:25.416937 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 00:07:25.417021 kubelet[2911]: E0128 00:07:25.416987 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 00:07:25.417177 kubelet[2911]: E0128 00:07:25.417107 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-468nq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d74b86764-tnxb5_calico-system(944e37ab-4f3a-457c-b85f-87dff3debf4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:25.418317 kubelet[2911]: E0128 00:07:25.418282 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:07:25.781370 systemd-networkd[1576]: cali474b0448161: Gained IPv6LL Jan 28 00:07:26.041312 kubelet[2911]: E0128 00:07:26.040653 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:07:26.062000 audit[4327]: NETFILTER_CFG table=filter:123 family=2 entries=20 op=nft_register_rule pid=4327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:26.062000 audit[4327]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffffe991600 a2=0 a3=1 items=0 ppid=3024 pid=4327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:26.062000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:26.075000 audit[4327]: NETFILTER_CFG table=nat:124 family=2 entries=14 op=nft_register_rule pid=4327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:26.075000 audit[4327]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=fffffe991600 a2=0 a3=1 items=0 ppid=3024 pid=4327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:26.075000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:26.485403 systemd-networkd[1576]: vxlan.calico: Gained IPv6LL Jan 28 00:07:27.929412 containerd[1660]: time="2026-01-28T00:07:27.929368617Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-564d6bbdf8-z4ccx,Uid:c30f0bb9-9acc-4a29-aba6-a8788797d9e8,Namespace:calico-apiserver,Attempt:0,}" Jan 28 00:07:27.930025 containerd[1660]: time="2026-01-28T00:07:27.929998498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-579c57b986-v9zhd,Uid:7d3bf2f0-73b0-413a-909b-a8ef20ea0438,Namespace:calico-system,Attempt:0,}" Jan 28 00:07:28.039484 systemd-networkd[1576]: cali729bbc17cd1: Link UP Jan 28 00:07:28.040080 systemd-networkd[1576]: cali729bbc17cd1: Gained carrier Jan 28 
00:07:28.057140 containerd[1660]: 2026-01-28 00:07:27.978 [INFO][4342] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--ea467cc685-k8s-calico--kube--controllers--579c57b986--v9zhd-eth0 calico-kube-controllers-579c57b986- calico-system 7d3bf2f0-73b0-413a-909b-a8ef20ea0438 851 0 2026-01-28 00:07:07 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:579c57b986 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4593-0-0-n-ea467cc685 calico-kube-controllers-579c57b986-v9zhd eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali729bbc17cd1 [] [] }} ContainerID="cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" Namespace="calico-system" Pod="calico-kube-controllers-579c57b986-v9zhd" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--kube--controllers--579c57b986--v9zhd-" Jan 28 00:07:28.057140 containerd[1660]: 2026-01-28 00:07:27.978 [INFO][4342] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" Namespace="calico-system" Pod="calico-kube-controllers-579c57b986-v9zhd" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--kube--controllers--579c57b986--v9zhd-eth0" Jan 28 00:07:28.057140 containerd[1660]: 2026-01-28 00:07:28.000 [INFO][4359] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" HandleID="k8s-pod-network.cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" Workload="ci--4593--0--0--n--ea467cc685-k8s-calico--kube--controllers--579c57b986--v9zhd-eth0" Jan 28 00:07:28.057428 containerd[1660]: 2026-01-28 00:07:28.000 [INFO][4359] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" HandleID="k8s-pod-network.cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" Workload="ci--4593--0--0--n--ea467cc685-k8s-calico--kube--controllers--579c57b986--v9zhd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001a3640), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-n-ea467cc685", "pod":"calico-kube-controllers-579c57b986-v9zhd", "timestamp":"2026-01-28 00:07:28.000159472 +0000 UTC"}, Hostname:"ci-4593-0-0-n-ea467cc685", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 00:07:28.057428 containerd[1660]: 2026-01-28 00:07:28.000 [INFO][4359] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 00:07:28.057428 containerd[1660]: 2026-01-28 00:07:28.000 [INFO][4359] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 28 00:07:28.057428 containerd[1660]: 2026-01-28 00:07:28.000 [INFO][4359] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-ea467cc685' Jan 28 00:07:28.057428 containerd[1660]: 2026-01-28 00:07:28.010 [INFO][4359] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.057428 containerd[1660]: 2026-01-28 00:07:28.014 [INFO][4359] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.057428 containerd[1660]: 2026-01-28 00:07:28.018 [INFO][4359] ipam/ipam.go 511: Trying affinity for 192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.057428 containerd[1660]: 2026-01-28 00:07:28.020 [INFO][4359] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.057428 containerd[1660]: 2026-01-28 00:07:28.022 [INFO][4359] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.057710 containerd[1660]: 2026-01-28 00:07:28.022 [INFO][4359] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.0/26 handle="k8s-pod-network.cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.057710 containerd[1660]: 2026-01-28 00:07:28.023 [INFO][4359] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18 Jan 28 00:07:28.057710 containerd[1660]: 2026-01-28 00:07:28.028 [INFO][4359] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.0/26 handle="k8s-pod-network.cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.057710 containerd[1660]: 2026-01-28 00:07:28.035 [INFO][4359] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.2/26] block=192.168.115.0/26 handle="k8s-pod-network.cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.057710 containerd[1660]: 2026-01-28 00:07:28.035 [INFO][4359] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.2/26] handle="k8s-pod-network.cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.057710 containerd[1660]: 2026-01-28 00:07:28.035 [INFO][4359] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 28 00:07:28.057710 containerd[1660]: 2026-01-28 00:07:28.035 [INFO][4359] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.2/26] IPv6=[] ContainerID="cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" HandleID="k8s-pod-network.cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" Workload="ci--4593--0--0--n--ea467cc685-k8s-calico--kube--controllers--579c57b986--v9zhd-eth0" Jan 28 00:07:28.057840 containerd[1660]: 2026-01-28 00:07:28.037 [INFO][4342] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" Namespace="calico-system" Pod="calico-kube-controllers-579c57b986-v9zhd" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--kube--controllers--579c57b986--v9zhd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-calico--kube--controllers--579c57b986--v9zhd-eth0", GenerateName:"calico-kube-controllers-579c57b986-", Namespace:"calico-system", SelfLink:"", UID:"7d3bf2f0-73b0-413a-909b-a8ef20ea0438", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 7, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"579c57b986", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"", Pod:"calico-kube-controllers-579c57b986-v9zhd", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.115.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali729bbc17cd1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:28.057891 containerd[1660]: 2026-01-28 00:07:28.037 [INFO][4342] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.2/32] ContainerID="cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" Namespace="calico-system" Pod="calico-kube-controllers-579c57b986-v9zhd" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--kube--controllers--579c57b986--v9zhd-eth0" Jan 28 00:07:28.057891 containerd[1660]: 2026-01-28 00:07:28.037 [INFO][4342] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali729bbc17cd1 ContainerID="cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" Namespace="calico-system" Pod="calico-kube-controllers-579c57b986-v9zhd" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--kube--controllers--579c57b986--v9zhd-eth0" Jan 28 00:07:28.057891 containerd[1660]: 2026-01-28 00:07:28.039 [INFO][4342] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" Namespace="calico-system" Pod="calico-kube-controllers-579c57b986-v9zhd" 
WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--kube--controllers--579c57b986--v9zhd-eth0" Jan 28 00:07:28.057950 containerd[1660]: 2026-01-28 00:07:28.041 [INFO][4342] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" Namespace="calico-system" Pod="calico-kube-controllers-579c57b986-v9zhd" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--kube--controllers--579c57b986--v9zhd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-calico--kube--controllers--579c57b986--v9zhd-eth0", GenerateName:"calico-kube-controllers-579c57b986-", Namespace:"calico-system", SelfLink:"", UID:"7d3bf2f0-73b0-413a-909b-a8ef20ea0438", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 7, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"579c57b986", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18", Pod:"calico-kube-controllers-579c57b986-v9zhd", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.115.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali729bbc17cd1", MAC:"32:b8:30:c9:93:21", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:28.057994 containerd[1660]: 2026-01-28 00:07:28.054 [INFO][4342] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" Namespace="calico-system" Pod="calico-kube-controllers-579c57b986-v9zhd" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--kube--controllers--579c57b986--v9zhd-eth0" Jan 28 00:07:28.067000 audit[4383]: NETFILTER_CFG table=filter:125 family=2 entries=36 op=nft_register_chain pid=4383 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 00:07:28.069462 kernel: kauditd_printk_skb: 231 callbacks suppressed Jan 28 00:07:28.069518 kernel: audit: type=1325 audit(1769558848.067:655): table=filter:125 family=2 entries=36 op=nft_register_chain pid=4383 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 00:07:28.067000 audit[4383]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19576 a0=3 a1=fffffe063cf0 a2=0 a3=ffff939befa8 items=0 ppid=4075 pid=4383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.075631 kernel: audit: type=1300 audit(1769558848.067:655): arch=c00000b7 syscall=211 success=yes exit=19576 a0=3 a1=fffffe063cf0 a2=0 a3=ffff939befa8 items=0 ppid=4075 pid=4383 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.067000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 00:07:28.078675 kernel: audit: type=1327 audit(1769558848.067:655): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 00:07:28.079340 containerd[1660]: time="2026-01-28T00:07:28.079304632Z" level=info msg="connecting to shim cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18" address="unix:///run/containerd/s/2e66015b7a4c8b7678ef1c3279263c15d82b506e21938828dfd74e32ccb4dbac" namespace=k8s.io protocol=ttrpc version=3 Jan 28 00:07:28.098381 systemd[1]: Started cri-containerd-cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18.scope - libcontainer container cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18. Jan 28 00:07:28.106000 audit: BPF prog-id=211 op=LOAD Jan 28 00:07:28.106000 audit: BPF prog-id=212 op=LOAD Jan 28 00:07:28.109481 kernel: audit: type=1334 audit(1769558848.106:656): prog-id=211 op=LOAD Jan 28 00:07:28.109518 kernel: audit: type=1334 audit(1769558848.106:657): prog-id=212 op=LOAD Jan 28 00:07:28.106000 audit[4402]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4392 pid=4402 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.113292 kernel: audit: type=1300 audit(1769558848.106:657): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4392 pid=4402 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.106000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366656465343230626465313464343462313462373265393937363230 Jan 28 00:07:28.116595 kernel: audit: type=1327 audit(1769558848.106:657): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366656465343230626465313464343462313462373265393937363230 Jan 28 00:07:28.117319 kernel: audit: type=1334 audit(1769558848.107:658): prog-id=212 op=UNLOAD Jan 28 00:07:28.117357 kernel: audit: type=1300 audit(1769558848.107:658): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4392 pid=4402 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.117389 kernel: audit: type=1327 audit(1769558848.107:658): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366656465343230626465313464343462313462373265393937363230 Jan 28 00:07:28.107000 audit: BPF prog-id=212 op=UNLOAD Jan 28 00:07:28.107000 audit[4402]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4392 pid=4402 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.107000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366656465343230626465313464343462313462373265393937363230 Jan 28 00:07:28.107000 audit: BPF prog-id=213 op=LOAD Jan 28 00:07:28.107000 audit[4402]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4392 pid=4402 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.107000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366656465343230626465313464343462313462373265393937363230 Jan 28 00:07:28.108000 audit: BPF prog-id=214 op=LOAD Jan 28 00:07:28.108000 audit[4402]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4392 pid=4402 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.108000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366656465343230626465313464343462313462373265393937363230 Jan 28 00:07:28.108000 audit: BPF prog-id=214 op=UNLOAD Jan 28 00:07:28.108000 audit[4402]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4392 pid=4402 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.108000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366656465343230626465313464343462313462373265393937363230 Jan 28 00:07:28.108000 audit: BPF prog-id=213 op=UNLOAD Jan 28 00:07:28.108000 audit[4402]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4392 pid=4402 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.108000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366656465343230626465313464343462313462373265393937363230 Jan 28 00:07:28.108000 audit: BPF prog-id=215 op=LOAD Jan 28 00:07:28.108000 audit[4402]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4392 pid=4402 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.108000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6366656465343230626465313464343462313462373265393937363230 Jan 28 00:07:28.157247 containerd[1660]: time="2026-01-28T00:07:28.156909468Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-579c57b986-v9zhd,Uid:7d3bf2f0-73b0-413a-909b-a8ef20ea0438,Namespace:calico-system,Attempt:0,} returns sandbox id \"cfede420bde14d44b14b72e997620bb3d52dfa22965be5ab65737a563cc71d18\"" Jan 28 00:07:28.157045 systemd-networkd[1576]: cali4c1680d9ba8: Link UP Jan 28 00:07:28.157542 systemd-networkd[1576]: cali4c1680d9ba8: Gained carrier Jan 28 00:07:28.160329 containerd[1660]: time="2026-01-28T00:07:28.160109997Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 00:07:28.173640 containerd[1660]: 2026-01-28 00:07:27.978 [INFO][4330] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--z4ccx-eth0 calico-apiserver-564d6bbdf8- calico-apiserver c30f0bb9-9acc-4a29-aba6-a8788797d9e8 846 0 2026-01-28 00:07:00 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:564d6bbdf8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4593-0-0-n-ea467cc685 calico-apiserver-564d6bbdf8-z4ccx eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali4c1680d9ba8 [] [] }} ContainerID="6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-z4ccx" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--z4ccx-" Jan 28 00:07:28.173640 containerd[1660]: 2026-01-28 00:07:27.978 [INFO][4330] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-z4ccx" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--z4ccx-eth0" Jan 28 00:07:28.173640 containerd[1660]: 2026-01-28 00:07:28.002 [INFO][4360] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" HandleID="k8s-pod-network.6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" Workload="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--z4ccx-eth0" Jan 28 00:07:28.174006 containerd[1660]: 2026-01-28 00:07:28.003 [INFO][4360] ipam/ipam_plugin.go 275: Auto 
assigning IP ContainerID="6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" HandleID="k8s-pod-network.6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" Workload="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--z4ccx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137540), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4593-0-0-n-ea467cc685", "pod":"calico-apiserver-564d6bbdf8-z4ccx", "timestamp":"2026-01-28 00:07:28.002694639 +0000 UTC"}, Hostname:"ci-4593-0-0-n-ea467cc685", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 00:07:28.174006 containerd[1660]: 2026-01-28 00:07:28.003 [INFO][4360] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 00:07:28.174006 containerd[1660]: 2026-01-28 00:07:28.035 [INFO][4360] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 28 00:07:28.174006 containerd[1660]: 2026-01-28 00:07:28.036 [INFO][4360] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-ea467cc685' Jan 28 00:07:28.174006 containerd[1660]: 2026-01-28 00:07:28.118 [INFO][4360] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.174006 containerd[1660]: 2026-01-28 00:07:28.124 [INFO][4360] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.174006 containerd[1660]: 2026-01-28 00:07:28.129 [INFO][4360] ipam/ipam.go 511: Trying affinity for 192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.174006 containerd[1660]: 2026-01-28 00:07:28.133 [INFO][4360] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.174006 containerd[1660]: 2026-01-28 00:07:28.135 [INFO][4360] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.174224 containerd[1660]: 2026-01-28 00:07:28.135 [INFO][4360] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.0/26 handle="k8s-pod-network.6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.174224 containerd[1660]: 2026-01-28 00:07:28.137 [INFO][4360] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27 Jan 28 00:07:28.174224 containerd[1660]: 2026-01-28 00:07:28.144 [INFO][4360] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.0/26 handle="k8s-pod-network.6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.174224 containerd[1660]: 2026-01-28 00:07:28.150 [INFO][4360] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.3/26] block=192.168.115.0/26 handle="k8s-pod-network.6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.174224 containerd[1660]: 2026-01-28 00:07:28.150 [INFO][4360] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.3/26] handle="k8s-pod-network.6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:28.174224 containerd[1660]: 2026-01-28 
00:07:28.150 [INFO][4360] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 28 00:07:28.174224 containerd[1660]: 2026-01-28 00:07:28.150 [INFO][4360] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.3/26] IPv6=[] ContainerID="6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" HandleID="k8s-pod-network.6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" Workload="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--z4ccx-eth0" Jan 28 00:07:28.174357 containerd[1660]: 2026-01-28 00:07:28.154 [INFO][4330] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-z4ccx" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--z4ccx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--z4ccx-eth0", GenerateName:"calico-apiserver-564d6bbdf8-", Namespace:"calico-apiserver", SelfLink:"", UID:"c30f0bb9-9acc-4a29-aba6-a8788797d9e8", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 7, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"564d6bbdf8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"", Pod:"calico-apiserver-564d6bbdf8-z4ccx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.115.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4c1680d9ba8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:28.174448 containerd[1660]: 2026-01-28 00:07:28.154 [INFO][4330] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.3/32] ContainerID="6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-z4ccx" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--z4ccx-eth0" Jan 28 00:07:28.174448 containerd[1660]: 2026-01-28 00:07:28.154 [INFO][4330] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4c1680d9ba8 ContainerID="6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-z4ccx" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--z4ccx-eth0" Jan 28 00:07:28.174448 containerd[1660]: 2026-01-28 00:07:28.157 [INFO][4330] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-z4ccx" 
WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--z4ccx-eth0" Jan 28 00:07:28.175303 containerd[1660]: 2026-01-28 00:07:28.158 [INFO][4330] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-z4ccx" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--z4ccx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--z4ccx-eth0", GenerateName:"calico-apiserver-564d6bbdf8-", Namespace:"calico-apiserver", SelfLink:"", UID:"c30f0bb9-9acc-4a29-aba6-a8788797d9e8", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 7, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"564d6bbdf8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27", Pod:"calico-apiserver-564d6bbdf8-z4ccx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.115.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4c1680d9ba8", MAC:"2a:05:b7:99:81:dd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:28.175367 containerd[1660]: 2026-01-28 00:07:28.171 [INFO][4330] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-z4ccx" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--z4ccx-eth0" Jan 28 00:07:28.185000 audit[4439]: NETFILTER_CFG table=filter:126 family=2 entries=60 op=nft_register_chain pid=4439 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 00:07:28.185000 audit[4439]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=32248 a0=3 a1=ffffe1ab87c0 a2=0 a3=ffff93baefa8 items=0 ppid=4075 pid=4439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.185000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 00:07:28.198280 containerd[1660]: time="2026-01-28T00:07:28.198203073Z" level=info msg="connecting to shim 6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27" address="unix:///run/containerd/s/f1e4837e997ba5d8c015a8f186b7174c724d74359d8a41a378f41d2b749a85ff" namespace=k8s.io protocol=ttrpc version=3 Jan 28 
00:07:28.220436 systemd[1]: Started cri-containerd-6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27.scope - libcontainer container 6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27. Jan 28 00:07:28.228000 audit: BPF prog-id=216 op=LOAD Jan 28 00:07:28.228000 audit: BPF prog-id=217 op=LOAD Jan 28 00:07:28.228000 audit[4460]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4447 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666353538363663353731366434353330343363353135343638313334 Jan 28 00:07:28.228000 audit: BPF prog-id=217 op=UNLOAD Jan 28 00:07:28.228000 audit[4460]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4447 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666353538363663353731366434353330343363353135343638313334 Jan 28 00:07:28.228000 audit: BPF prog-id=218 op=LOAD Jan 28 00:07:28.228000 audit[4460]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4447 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666353538363663353731366434353330343363353135343638313334 Jan 28 00:07:28.228000 audit: BPF prog-id=219 op=LOAD Jan 28 00:07:28.228000 audit[4460]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4447 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666353538363663353731366434353330343363353135343638313334 Jan 28 00:07:28.228000 audit: BPF prog-id=219 op=UNLOAD Jan 28 00:07:28.228000 audit[4460]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4447 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.228000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666353538363663353731366434353330343363353135343638313334 Jan 28 00:07:28.228000 audit: BPF prog-id=218 op=UNLOAD Jan 28 00:07:28.228000 audit[4460]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4447 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666353538363663353731366434353330343363353135343638313334 Jan 28 00:07:28.228000 audit: BPF prog-id=220 op=LOAD Jan 28 00:07:28.228000 audit[4460]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4447 pid=4460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:28.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3666353538363663353731366434353330343363353135343638313334 Jan 28 00:07:28.250782 containerd[1660]: time="2026-01-28T00:07:28.250746233Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-564d6bbdf8-z4ccx,Uid:c30f0bb9-9acc-4a29-aba6-a8788797d9e8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6f55866c5716d453043c515468134adc2f45effffa60e5a98f0407e77c34fb27\"" Jan 28 00:07:28.508247 containerd[1660]: time="2026-01-28T00:07:28.507799853Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:28.509193 containerd[1660]: time="2026-01-28T00:07:28.509096937Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 00:07:28.509193 containerd[1660]: time="2026-01-28T00:07:28.509161018Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:28.509403 kubelet[2911]: E0128 00:07:28.509346 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:07:28.509403 kubelet[2911]: E0128 00:07:28.509400 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:07:28.509743 kubelet[2911]: E0128 00:07:28.509683 2911 kuberuntime_manager.go:1358] "Unhandled 
Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nkhp5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-579c57b986-v9zhd_calico-system(7d3bf2f0-73b0-413a-909b-a8ef20ea0438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:28.509848 containerd[1660]: time="2026-01-28T00:07:28.509819580Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:07:28.510876 kubelet[2911]: E0128 00:07:28.510848 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:07:28.855481 containerd[1660]: 
time="2026-01-28T00:07:28.855399709Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:28.857437 containerd[1660]: time="2026-01-28T00:07:28.857400155Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:07:28.857518 containerd[1660]: time="2026-01-28T00:07:28.857481196Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:28.857734 kubelet[2911]: E0128 00:07:28.857694 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:07:28.857787 kubelet[2911]: E0128 00:07:28.857743 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:07:28.857915 kubelet[2911]: E0128 00:07:28.857866 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lvvxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
calico-apiserver-564d6bbdf8-z4ccx_calico-apiserver(c30f0bb9-9acc-4a29-aba6-a8788797d9e8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:28.859326 kubelet[2911]: E0128 00:07:28.859284 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:07:29.047944 kubelet[2911]: E0128 00:07:29.047907 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:07:29.050532 kubelet[2911]: E0128 00:07:29.050236 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:07:29.098000 audit[4486]: NETFILTER_CFG table=filter:127 family=2 entries=20 op=nft_register_rule pid=4486 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:29.098000 audit[4486]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffff13ad970 a2=0 a3=1 items=0 ppid=3024 pid=4486 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:29.098000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:29.107000 audit[4486]: NETFILTER_CFG table=nat:128 family=2 entries=14 op=nft_register_rule pid=4486 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:29.107000 audit[4486]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=fffff13ad970 a2=0 a3=1 items=0 ppid=3024 pid=4486 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:29.107000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:29.365381 systemd-networkd[1576]: cali4c1680d9ba8: Gained IPv6LL Jan 28 00:07:29.928932 containerd[1660]: time="2026-01-28T00:07:29.928716969Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-4g849,Uid:115118f6-6ada-4444-93ef-3a99eaaacd44,Namespace:calico-system,Attempt:0,}" Jan 28 00:07:29.928932 containerd[1660]: time="2026-01-28T00:07:29.928852810Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gql7d,Uid:33c364a8-976a-4a66-b401-5e5e3f5c6aca,Namespace:calico-system,Attempt:0,}" Jan 28 00:07:29.928932 containerd[1660]: time="2026-01-28T00:07:29.928877770Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vcgk6,Uid:e8f7a78c-8d11-4e0b-a34b-f3ae93c9f74b,Namespace:kube-system,Attempt:0,}" Jan 28 00:07:30.006935 systemd-networkd[1576]: cali729bbc17cd1: Gained IPv6LL Jan 28 00:07:30.052401 kubelet[2911]: E0128 00:07:30.052362 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:07:30.053136 kubelet[2911]: E0128 00:07:30.052747 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:07:30.072772 systemd-networkd[1576]: caliee4c01dd963: Link UP Jan 28 00:07:30.073870 systemd-networkd[1576]: caliee4c01dd963: Gained carrier Jan 28 00:07:30.089697 containerd[1660]: 2026-01-28 00:07:29.981 [INFO][4488] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--ea467cc685-k8s-goldmane--666569f655--4g849-eth0 goldmane-666569f655- calico-system 115118f6-6ada-4444-93ef-3a99eaaacd44 849 0 2026-01-28 00:07:05 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4593-0-0-n-ea467cc685 goldmane-666569f655-4g849 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] caliee4c01dd963 [] [] }} ContainerID="71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" Namespace="calico-system" Pod="goldmane-666569f655-4g849" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-goldmane--666569f655--4g849-" Jan 28 00:07:30.089697 containerd[1660]: 2026-01-28 00:07:29.981 [INFO][4488] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" Namespace="calico-system" Pod="goldmane-666569f655-4g849" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-goldmane--666569f655--4g849-eth0" Jan 28 00:07:30.089697 containerd[1660]: 2026-01-28 00:07:30.021 [INFO][4535] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" HandleID="k8s-pod-network.71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" Workload="ci--4593--0--0--n--ea467cc685-k8s-goldmane--666569f655--4g849-eth0" Jan 28 00:07:30.089879 containerd[1660]: 2026-01-28 00:07:30.022 [INFO][4535] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" HandleID="k8s-pod-network.71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" Workload="ci--4593--0--0--n--ea467cc685-k8s-goldmane--666569f655--4g849-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001a0830), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-n-ea467cc685", "pod":"goldmane-666569f655-4g849", "timestamp":"2026-01-28 00:07:30.021875372 +0000 UTC"}, Hostname:"ci-4593-0-0-n-ea467cc685", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 00:07:30.089879 containerd[1660]: 2026-01-28 00:07:30.022 [INFO][4535] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 00:07:30.089879 containerd[1660]: 2026-01-28 00:07:30.022 [INFO][4535] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 28 00:07:30.089879 containerd[1660]: 2026-01-28 00:07:30.022 [INFO][4535] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-ea467cc685' Jan 28 00:07:30.089879 containerd[1660]: 2026-01-28 00:07:30.032 [INFO][4535] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.089879 containerd[1660]: 2026-01-28 00:07:30.038 [INFO][4535] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.089879 containerd[1660]: 2026-01-28 00:07:30.046 [INFO][4535] ipam/ipam.go 511: Trying affinity for 192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.089879 containerd[1660]: 2026-01-28 00:07:30.048 [INFO][4535] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.089879 containerd[1660]: 2026-01-28 00:07:30.051 [INFO][4535] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.090058 containerd[1660]: 2026-01-28 00:07:30.051 [INFO][4535] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.0/26 handle="k8s-pod-network.71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.090058 containerd[1660]: 2026-01-28 00:07:30.053 [INFO][4535] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36 Jan 28 00:07:30.090058 containerd[1660]: 2026-01-28 00:07:30.060 [INFO][4535] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.0/26 handle="k8s-pod-network.71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.090058 containerd[1660]: 2026-01-28 00:07:30.067 [INFO][4535] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.4/26] block=192.168.115.0/26 handle="k8s-pod-network.71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" 
host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.090058 containerd[1660]: 2026-01-28 00:07:30.067 [INFO][4535] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.4/26] handle="k8s-pod-network.71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.090058 containerd[1660]: 2026-01-28 00:07:30.067 [INFO][4535] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 28 00:07:30.090058 containerd[1660]: 2026-01-28 00:07:30.067 [INFO][4535] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.4/26] IPv6=[] ContainerID="71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" HandleID="k8s-pod-network.71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" Workload="ci--4593--0--0--n--ea467cc685-k8s-goldmane--666569f655--4g849-eth0" Jan 28 00:07:30.090179 containerd[1660]: 2026-01-28 00:07:30.070 [INFO][4488] cni-plugin/k8s.go 418: Populated endpoint ContainerID="71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" Namespace="calico-system" Pod="goldmane-666569f655-4g849" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-goldmane--666569f655--4g849-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-goldmane--666569f655--4g849-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"115118f6-6ada-4444-93ef-3a99eaaacd44", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 7, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"", Pod:"goldmane-666569f655-4g849", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.115.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"caliee4c01dd963", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:30.090280 containerd[1660]: 2026-01-28 00:07:30.070 [INFO][4488] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.4/32] ContainerID="71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" Namespace="calico-system" Pod="goldmane-666569f655-4g849" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-goldmane--666569f655--4g849-eth0" Jan 28 00:07:30.090280 containerd[1660]: 2026-01-28 00:07:30.070 [INFO][4488] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliee4c01dd963 ContainerID="71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" Namespace="calico-system" Pod="goldmane-666569f655-4g849" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-goldmane--666569f655--4g849-eth0" Jan 28 00:07:30.090280 containerd[1660]: 2026-01-28 00:07:30.074 [INFO][4488] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" Namespace="calico-system" Pod="goldmane-666569f655-4g849" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-goldmane--666569f655--4g849-eth0" Jan 28 00:07:30.090351 containerd[1660]: 2026-01-28 00:07:30.075 [INFO][4488] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" Namespace="calico-system" Pod="goldmane-666569f655-4g849" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-goldmane--666569f655--4g849-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-goldmane--666569f655--4g849-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"115118f6-6ada-4444-93ef-3a99eaaacd44", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 7, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36", Pod:"goldmane-666569f655-4g849", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.115.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"caliee4c01dd963", MAC:"1a:e1:47:13:27:1d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:30.090397 containerd[1660]: 2026-01-28 00:07:30.088 [INFO][4488] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" Namespace="calico-system" Pod="goldmane-666569f655-4g849" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-goldmane--666569f655--4g849-eth0" Jan 28 00:07:30.104000 audit[4574]: NETFILTER_CFG table=filter:129 family=2 entries=48 op=nft_register_chain pid=4574 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 00:07:30.104000 audit[4574]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=26352 a0=3 a1=ffffcd0d4a40 a2=0 a3=ffff96afffa8 items=0 ppid=4075 pid=4574 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.104000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 00:07:30.113681 containerd[1660]: time="2026-01-28T00:07:30.113639051Z" level=info msg="connecting to shim 71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36" address="unix:///run/containerd/s/55de069b4293b2f63ec8a0423209ad6e8df454605e78fb8ab7f0c93aaf9ba1f1" namespace=k8s.io protocol=ttrpc version=3 Jan 28 
00:07:30.143422 systemd[1]: Started cri-containerd-71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36.scope - libcontainer container 71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36. Jan 28 00:07:30.158000 audit: BPF prog-id=221 op=LOAD Jan 28 00:07:30.159000 audit: BPF prog-id=222 op=LOAD Jan 28 00:07:30.159000 audit[4595]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4583 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.159000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731636364653034666637656461633435376663633430386639323965 Jan 28 00:07:30.159000 audit: BPF prog-id=222 op=UNLOAD Jan 28 00:07:30.159000 audit[4595]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4583 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.159000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731636364653034666637656461633435376663633430386639323965 Jan 28 00:07:30.159000 audit: BPF prog-id=223 op=LOAD Jan 28 00:07:30.159000 audit[4595]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4583 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.159000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731636364653034666637656461633435376663633430386639323965 Jan 28 00:07:30.159000 audit: BPF prog-id=224 op=LOAD Jan 28 00:07:30.159000 audit[4595]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=4583 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.159000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731636364653034666637656461633435376663633430386639323965 Jan 28 00:07:30.159000 audit: BPF prog-id=224 op=UNLOAD Jan 28 00:07:30.159000 audit[4595]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4583 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.159000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731636364653034666637656461633435376663633430386639323965 Jan 28 00:07:30.159000 audit: BPF prog-id=223 op=UNLOAD Jan 28 00:07:30.159000 audit[4595]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4583 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.159000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731636364653034666637656461633435376663633430386639323965 Jan 28 00:07:30.159000 audit: BPF prog-id=225 op=LOAD Jan 28 00:07:30.159000 audit[4595]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=4583 pid=4595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.159000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3731636364653034666637656461633435376663633430386639323965 Jan 28 00:07:30.170440 systemd-networkd[1576]: cali30ae9e816a0: Link UP Jan 28 00:07:30.170801 systemd-networkd[1576]: cali30ae9e816a0: Gained carrier Jan 28 00:07:30.191629 containerd[1660]: 2026-01-28 00:07:29.989 [INFO][4499] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--vcgk6-eth0 coredns-674b8bbfcf- kube-system e8f7a78c-8d11-4e0b-a34b-f3ae93c9f74b 842 0 2026-01-28 00:06:51 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4593-0-0-n-ea467cc685 coredns-674b8bbfcf-vcgk6 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali30ae9e816a0 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" Namespace="kube-system" Pod="coredns-674b8bbfcf-vcgk6" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--vcgk6-" Jan 28 00:07:30.191629 containerd[1660]: 2026-01-28 00:07:29.989 [INFO][4499] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" Namespace="kube-system" Pod="coredns-674b8bbfcf-vcgk6" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--vcgk6-eth0" Jan 28 00:07:30.191629 containerd[1660]: 2026-01-28 00:07:30.023 [INFO][4542] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" HandleID="k8s-pod-network.ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" Workload="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--vcgk6-eth0" Jan 28 00:07:30.191842 containerd[1660]: 2026-01-28 00:07:30.023 [INFO][4542] 
ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" HandleID="k8s-pod-network.ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" Workload="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--vcgk6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000136e30), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4593-0-0-n-ea467cc685", "pod":"coredns-674b8bbfcf-vcgk6", "timestamp":"2026-01-28 00:07:30.023557897 +0000 UTC"}, Hostname:"ci-4593-0-0-n-ea467cc685", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 00:07:30.191842 containerd[1660]: 2026-01-28 00:07:30.023 [INFO][4542] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 00:07:30.191842 containerd[1660]: 2026-01-28 00:07:30.067 [INFO][4542] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 28 00:07:30.191842 containerd[1660]: 2026-01-28 00:07:30.068 [INFO][4542] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-ea467cc685' Jan 28 00:07:30.191842 containerd[1660]: 2026-01-28 00:07:30.134 [INFO][4542] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.191842 containerd[1660]: 2026-01-28 00:07:30.139 [INFO][4542] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.191842 containerd[1660]: 2026-01-28 00:07:30.146 [INFO][4542] ipam/ipam.go 511: Trying affinity for 192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.191842 containerd[1660]: 2026-01-28 00:07:30.149 [INFO][4542] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.191842 containerd[1660]: 2026-01-28 00:07:30.151 [INFO][4542] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.192441 containerd[1660]: 2026-01-28 00:07:30.152 [INFO][4542] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.0/26 handle="k8s-pod-network.ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.192441 containerd[1660]: 2026-01-28 00:07:30.153 [INFO][4542] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824 Jan 28 00:07:30.192441 containerd[1660]: 2026-01-28 00:07:30.157 [INFO][4542] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.0/26 handle="k8s-pod-network.ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.192441 containerd[1660]: 2026-01-28 00:07:30.166 [INFO][4542] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.5/26] block=192.168.115.0/26 handle="k8s-pod-network.ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.192441 containerd[1660]: 2026-01-28 00:07:30.166 [INFO][4542] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.5/26] handle="k8s-pod-network.ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.192441 containerd[1660]: 
2026-01-28 00:07:30.166 [INFO][4542] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 28 00:07:30.192441 containerd[1660]: 2026-01-28 00:07:30.166 [INFO][4542] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.5/26] IPv6=[] ContainerID="ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" HandleID="k8s-pod-network.ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" Workload="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--vcgk6-eth0" Jan 28 00:07:30.192569 containerd[1660]: 2026-01-28 00:07:30.168 [INFO][4499] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" Namespace="kube-system" Pod="coredns-674b8bbfcf-vcgk6" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--vcgk6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--vcgk6-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"e8f7a78c-8d11-4e0b-a34b-f3ae93c9f74b", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 6, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"", Pod:"coredns-674b8bbfcf-vcgk6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.115.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali30ae9e816a0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:30.192569 containerd[1660]: 2026-01-28 00:07:30.168 [INFO][4499] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.5/32] ContainerID="ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" Namespace="kube-system" Pod="coredns-674b8bbfcf-vcgk6" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--vcgk6-eth0" Jan 28 00:07:30.192569 containerd[1660]: 2026-01-28 00:07:30.168 [INFO][4499] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali30ae9e816a0 ContainerID="ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" Namespace="kube-system" Pod="coredns-674b8bbfcf-vcgk6" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--vcgk6-eth0" Jan 28 00:07:30.192569 containerd[1660]: 2026-01-28 00:07:30.170 [INFO][4499] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" Namespace="kube-system" Pod="coredns-674b8bbfcf-vcgk6" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--vcgk6-eth0" Jan 28 00:07:30.192569 containerd[1660]: 2026-01-28 00:07:30.170 [INFO][4499] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" Namespace="kube-system" Pod="coredns-674b8bbfcf-vcgk6" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--vcgk6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--vcgk6-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"e8f7a78c-8d11-4e0b-a34b-f3ae93c9f74b", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 6, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824", Pod:"coredns-674b8bbfcf-vcgk6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.115.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali30ae9e816a0", MAC:"1a:92:ff:73:52:29", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:30.192569 containerd[1660]: 2026-01-28 00:07:30.188 [INFO][4499] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" Namespace="kube-system" Pod="coredns-674b8bbfcf-vcgk6" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--vcgk6-eth0" Jan 28 00:07:30.202067 containerd[1660]: time="2026-01-28T00:07:30.202031239Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-4g849,Uid:115118f6-6ada-4444-93ef-3a99eaaacd44,Namespace:calico-system,Attempt:0,} returns sandbox id \"71ccde04ff7edac457fcc408f929e066e259b27c1ea5fbbe3394b693bfae9d36\"" Jan 28 00:07:30.203172 containerd[1660]: time="2026-01-28T00:07:30.203147243Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 00:07:30.211000 audit[4630]: NETFILTER_CFG table=filter:130 family=2 entries=50 op=nft_register_chain pid=4630 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 00:07:30.211000 audit[4630]: SYSCALL arch=c00000b7 
syscall=211 success=yes exit=24912 a0=3 a1=ffffdab9dc80 a2=0 a3=ffffa9c75fa8 items=0 ppid=4075 pid=4630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.211000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 00:07:30.221805 containerd[1660]: time="2026-01-28T00:07:30.221721019Z" level=info msg="connecting to shim ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824" address="unix:///run/containerd/s/51448e153780341e73c503969d849f018e1945239df471cf30ba8bea19abb2e8" namespace=k8s.io protocol=ttrpc version=3 Jan 28 00:07:30.243408 systemd[1]: Started cri-containerd-ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824.scope - libcontainer container ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824. Jan 28 00:07:30.255000 audit: BPF prog-id=226 op=LOAD Jan 28 00:07:30.256000 audit: BPF prog-id=227 op=LOAD Jan 28 00:07:30.256000 audit[4651]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4639 pid=4651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162386263623161643939616333383132386561616133623063346331 Jan 28 00:07:30.256000 audit: BPF prog-id=227 op=UNLOAD Jan 28 00:07:30.256000 audit[4651]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4639 pid=4651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162386263623161643939616333383132386561616133623063346331 Jan 28 00:07:30.256000 audit: BPF prog-id=228 op=LOAD Jan 28 00:07:30.256000 audit[4651]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4639 pid=4651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162386263623161643939616333383132386561616133623063346331 Jan 28 00:07:30.256000 audit: BPF prog-id=229 op=LOAD Jan 28 00:07:30.256000 audit[4651]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4639 pid=4651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.256000 audit: 
PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162386263623161643939616333383132386561616133623063346331 Jan 28 00:07:30.256000 audit: BPF prog-id=229 op=UNLOAD Jan 28 00:07:30.256000 audit[4651]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4639 pid=4651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162386263623161643939616333383132386561616133623063346331 Jan 28 00:07:30.256000 audit: BPF prog-id=228 op=UNLOAD Jan 28 00:07:30.256000 audit[4651]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4639 pid=4651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162386263623161643939616333383132386561616133623063346331 Jan 28 00:07:30.256000 audit: BPF prog-id=230 op=LOAD Jan 28 00:07:30.256000 audit[4651]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4639 pid=4651 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.256000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162386263623161643939616333383132386561616133623063346331 Jan 28 00:07:30.274485 systemd-networkd[1576]: cali00e35555e9c: Link UP Jan 28 00:07:30.276047 systemd-networkd[1576]: cali00e35555e9c: Gained carrier Jan 28 00:07:30.286872 containerd[1660]: time="2026-01-28T00:07:30.286749537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vcgk6,Uid:e8f7a78c-8d11-4e0b-a34b-f3ae93c9f74b,Namespace:kube-system,Attempt:0,} returns sandbox id \"ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824\"" Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:29.995 [INFO][4509] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--ea467cc685-k8s-csi--node--driver--gql7d-eth0 csi-node-driver- calico-system 33c364a8-976a-4a66-b401-5e5e3f5c6aca 751 0 2026-01-28 00:07:07 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4593-0-0-n-ea467cc685 csi-node-driver-gql7d eth0 csi-node-driver [] [] [kns.calico-system 
ksa.calico-system.csi-node-driver] cali00e35555e9c [] [] }} ContainerID="8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" Namespace="calico-system" Pod="csi-node-driver-gql7d" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-csi--node--driver--gql7d-" Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:29.995 [INFO][4509] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" Namespace="calico-system" Pod="csi-node-driver-gql7d" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-csi--node--driver--gql7d-eth0" Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.042 [INFO][4549] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" HandleID="k8s-pod-network.8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" Workload="ci--4593--0--0--n--ea467cc685-k8s-csi--node--driver--gql7d-eth0" Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.043 [INFO][4549] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" HandleID="k8s-pod-network.8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" Workload="ci--4593--0--0--n--ea467cc685-k8s-csi--node--driver--gql7d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000136df0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4593-0-0-n-ea467cc685", "pod":"csi-node-driver-gql7d", "timestamp":"2026-01-28 00:07:30.042875116 +0000 UTC"}, Hostname:"ci-4593-0-0-n-ea467cc685", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.043 [INFO][4549] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.166 [INFO][4549] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.166 [INFO][4549] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-ea467cc685' Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.234 [INFO][4549] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.240 [INFO][4549] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.249 [INFO][4549] ipam/ipam.go 511: Trying affinity for 192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.251 [INFO][4549] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.253 [INFO][4549] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.253 [INFO][4549] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.0/26 handle="k8s-pod-network.8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.255 [INFO][4549] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725 Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.259 [INFO][4549] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.0/26 handle="k8s-pod-network.8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.266 [INFO][4549] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.6/26] block=192.168.115.0/26 handle="k8s-pod-network.8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.266 [INFO][4549] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.6/26] handle="k8s-pod-network.8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.266 [INFO][4549] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 28 00:07:30.292738 containerd[1660]: 2026-01-28 00:07:30.266 [INFO][4549] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.6/26] IPv6=[] ContainerID="8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" HandleID="k8s-pod-network.8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" Workload="ci--4593--0--0--n--ea467cc685-k8s-csi--node--driver--gql7d-eth0" Jan 28 00:07:30.294526 containerd[1660]: 2026-01-28 00:07:30.270 [INFO][4509] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" Namespace="calico-system" Pod="csi-node-driver-gql7d" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-csi--node--driver--gql7d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-csi--node--driver--gql7d-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"33c364a8-976a-4a66-b401-5e5e3f5c6aca", ResourceVersion:"751", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 7, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"", Pod:"csi-node-driver-gql7d", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.115.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali00e35555e9c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:30.294526 containerd[1660]: 2026-01-28 00:07:30.270 [INFO][4509] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.6/32] ContainerID="8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" Namespace="calico-system" Pod="csi-node-driver-gql7d" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-csi--node--driver--gql7d-eth0" Jan 28 00:07:30.294526 containerd[1660]: 2026-01-28 00:07:30.270 [INFO][4509] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali00e35555e9c ContainerID="8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" Namespace="calico-system" Pod="csi-node-driver-gql7d" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-csi--node--driver--gql7d-eth0" Jan 28 00:07:30.294526 containerd[1660]: 2026-01-28 00:07:30.275 [INFO][4509] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" Namespace="calico-system" Pod="csi-node-driver-gql7d" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-csi--node--driver--gql7d-eth0" Jan 28 00:07:30.294526 containerd[1660]: 2026-01-28 00:07:30.275 [INFO][4509] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" Namespace="calico-system" Pod="csi-node-driver-gql7d" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-csi--node--driver--gql7d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-csi--node--driver--gql7d-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"33c364a8-976a-4a66-b401-5e5e3f5c6aca", ResourceVersion:"751", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 7, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725", Pod:"csi-node-driver-gql7d", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.115.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali00e35555e9c", MAC:"02:18:88:6e:da:67", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:30.294526 containerd[1660]: 2026-01-28 00:07:30.289 [INFO][4509] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" Namespace="calico-system" Pod="csi-node-driver-gql7d" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-csi--node--driver--gql7d-eth0" Jan 28 00:07:30.298002 containerd[1660]: time="2026-01-28T00:07:30.297765570Z" level=info msg="CreateContainer within sandbox \"ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 28 00:07:30.302000 audit[4686]: NETFILTER_CFG table=filter:131 family=2 entries=48 op=nft_register_chain pid=4686 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 00:07:30.302000 audit[4686]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=23124 a0=3 a1=fffff12ba8a0 a2=0 a3=ffffb98bafa8 items=0 ppid=4075 pid=4686 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.302000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 00:07:30.313564 containerd[1660]: time="2026-01-28T00:07:30.313533458Z" level=info msg="Container 5e43bd13d286b9610dfe200c981b5ddb62926109c2ccc828897fa9d4f2f259e4: CDI devices from CRI Config.CDIDevices: []" Jan 28 00:07:30.316667 containerd[1660]: time="2026-01-28T00:07:30.316613987Z" level=info msg="connecting to shim 8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725" 
address="unix:///run/containerd/s/92ba3dad5deb0dc8912982435f3600417d7a989b0ea2e14332aec30d9f2522cc" namespace=k8s.io protocol=ttrpc version=3 Jan 28 00:07:30.321489 containerd[1660]: time="2026-01-28T00:07:30.321451442Z" level=info msg="CreateContainer within sandbox \"ab8bcb1ad99ac38128eaaa3b0c4c13fd326351bdad71dce65ead5c1f39a14824\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5e43bd13d286b9610dfe200c981b5ddb62926109c2ccc828897fa9d4f2f259e4\"" Jan 28 00:07:30.322408 containerd[1660]: time="2026-01-28T00:07:30.322295565Z" level=info msg="StartContainer for \"5e43bd13d286b9610dfe200c981b5ddb62926109c2ccc828897fa9d4f2f259e4\"" Jan 28 00:07:30.323572 containerd[1660]: time="2026-01-28T00:07:30.323513968Z" level=info msg="connecting to shim 5e43bd13d286b9610dfe200c981b5ddb62926109c2ccc828897fa9d4f2f259e4" address="unix:///run/containerd/s/51448e153780341e73c503969d849f018e1945239df471cf30ba8bea19abb2e8" protocol=ttrpc version=3 Jan 28 00:07:30.340411 systemd[1]: Started cri-containerd-8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725.scope - libcontainer container 8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725. Jan 28 00:07:30.344060 systemd[1]: Started cri-containerd-5e43bd13d286b9610dfe200c981b5ddb62926109c2ccc828897fa9d4f2f259e4.scope - libcontainer container 5e43bd13d286b9610dfe200c981b5ddb62926109c2ccc828897fa9d4f2f259e4. Jan 28 00:07:30.350000 audit: BPF prog-id=231 op=LOAD Jan 28 00:07:30.350000 audit: BPF prog-id=232 op=LOAD Jan 28 00:07:30.350000 audit[4706]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138180 a2=98 a3=0 items=0 ppid=4695 pid=4706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864626235613630353165373133313966646232343739333865353637 Jan 28 00:07:30.350000 audit: BPF prog-id=232 op=UNLOAD Jan 28 00:07:30.350000 audit[4706]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4695 pid=4706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864626235613630353165373133313966646232343739333865353637 Jan 28 00:07:30.351000 audit: BPF prog-id=233 op=LOAD Jan 28 00:07:30.351000 audit[4706]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001383e8 a2=98 a3=0 items=0 ppid=4695 pid=4706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.351000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864626235613630353165373133313966646232343739333865353637 Jan 28 00:07:30.351000 audit: BPF prog-id=234 
op=LOAD Jan 28 00:07:30.351000 audit[4706]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000138168 a2=98 a3=0 items=0 ppid=4695 pid=4706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.351000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864626235613630353165373133313966646232343739333865353637 Jan 28 00:07:30.351000 audit: BPF prog-id=234 op=UNLOAD Jan 28 00:07:30.351000 audit[4706]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4695 pid=4706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.351000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864626235613630353165373133313966646232343739333865353637 Jan 28 00:07:30.351000 audit: BPF prog-id=233 op=UNLOAD Jan 28 00:07:30.351000 audit[4706]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4695 pid=4706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.351000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864626235613630353165373133313966646232343739333865353637 Jan 28 00:07:30.351000 audit: BPF prog-id=235 op=LOAD Jan 28 00:07:30.351000 audit[4706]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138648 a2=98 a3=0 items=0 ppid=4695 pid=4706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.351000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3864626235613630353165373133313966646232343739333865353637 Jan 28 00:07:30.353000 audit: BPF prog-id=236 op=LOAD Jan 28 00:07:30.354000 audit: BPF prog-id=237 op=LOAD Jan 28 00:07:30.354000 audit[4713]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400010c180 a2=98 a3=0 items=0 ppid=4639 pid=4713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.354000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565343362643133643238366239363130646665323030633938316235 Jan 28 00:07:30.354000 audit: BPF prog-id=237 op=UNLOAD Jan 28 00:07:30.354000 audit[4713]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 
a0=15 a1=0 a2=0 a3=0 items=0 ppid=4639 pid=4713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.354000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565343362643133643238366239363130646665323030633938316235 Jan 28 00:07:30.354000 audit: BPF prog-id=238 op=LOAD Jan 28 00:07:30.354000 audit[4713]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400010c3e8 a2=98 a3=0 items=0 ppid=4639 pid=4713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.354000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565343362643133643238366239363130646665323030633938316235 Jan 28 00:07:30.354000 audit: BPF prog-id=239 op=LOAD Jan 28 00:07:30.354000 audit[4713]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=400010c168 a2=98 a3=0 items=0 ppid=4639 pid=4713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.354000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565343362643133643238366239363130646665323030633938316235 Jan 28 00:07:30.354000 audit: BPF prog-id=239 op=UNLOAD Jan 28 00:07:30.354000 audit[4713]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4639 pid=4713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.354000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565343362643133643238366239363130646665323030633938316235 Jan 28 00:07:30.354000 audit: BPF prog-id=238 op=UNLOAD Jan 28 00:07:30.354000 audit[4713]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4639 pid=4713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.354000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565343362643133643238366239363130646665323030633938316235 Jan 28 00:07:30.354000 audit: BPF prog-id=240 op=LOAD Jan 28 00:07:30.354000 audit[4713]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400010c648 a2=98 a3=0 items=0 ppid=4639 pid=4713 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:30.354000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565343362643133643238366239363130646665323030633938316235 Jan 28 00:07:30.378235 containerd[1660]: time="2026-01-28T00:07:30.378021894Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gql7d,Uid:33c364a8-976a-4a66-b401-5e5e3f5c6aca,Namespace:calico-system,Attempt:0,} returns sandbox id \"8dbb5a6051e71319fdb247938e5671b2e3124dba35af90fa64ea09bf12fe9725\"" Jan 28 00:07:30.380806 containerd[1660]: time="2026-01-28T00:07:30.380589582Z" level=info msg="StartContainer for \"5e43bd13d286b9610dfe200c981b5ddb62926109c2ccc828897fa9d4f2f259e4\" returns successfully" Jan 28 00:07:30.551627 containerd[1660]: time="2026-01-28T00:07:30.551427701Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:30.553945 containerd[1660]: time="2026-01-28T00:07:30.553897988Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 00:07:30.554066 containerd[1660]: time="2026-01-28T00:07:30.553987708Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:30.554269 kubelet[2911]: E0128 00:07:30.554230 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:07:30.554358 kubelet[2911]: E0128 00:07:30.554343 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:07:30.554705 containerd[1660]: time="2026-01-28T00:07:30.554681470Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 00:07:30.554842 kubelet[2911]: E0128 00:07:30.554724 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dh76x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-4g849_calico-system(115118f6-6ada-4444-93ef-3a99eaaacd44): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:30.556493 kubelet[2911]: E0128 00:07:30.556324 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:07:30.892037 containerd[1660]: time="2026-01-28T00:07:30.891932655Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 
28 00:07:30.893535 containerd[1660]: time="2026-01-28T00:07:30.893501060Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 00:07:30.893701 containerd[1660]: time="2026-01-28T00:07:30.893590820Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:30.893735 kubelet[2911]: E0128 00:07:30.893696 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:07:30.893769 kubelet[2911]: E0128 00:07:30.893742 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:07:30.893939 kubelet[2911]: E0128 00:07:30.893875 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs5cv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-gql7d_calico-system(33c364a8-976a-4a66-b401-5e5e3f5c6aca): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:30.896931 containerd[1660]: time="2026-01-28T00:07:30.896763429Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 00:07:30.929112 containerd[1660]: time="2026-01-28T00:07:30.929058728Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-6kv44,Uid:88283619-c3e2-4b73-b5a3-401f940acb8e,Namespace:kube-system,Attempt:0,}" Jan 28 00:07:31.027438 systemd-networkd[1576]: cali4a9dbb6d91b: Link UP Jan 28 00:07:31.028054 systemd-networkd[1576]: cali4a9dbb6d91b: Gained carrier Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:30.964 [INFO][4767] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--6kv44-eth0 coredns-674b8bbfcf- kube-system 88283619-c3e2-4b73-b5a3-401f940acb8e 848 0 2026-01-28 00:06:51 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4593-0-0-n-ea467cc685 coredns-674b8bbfcf-6kv44 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4a9dbb6d91b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" Namespace="kube-system" Pod="coredns-674b8bbfcf-6kv44" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--6kv44-" Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:30.964 [INFO][4767] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" Namespace="kube-system" Pod="coredns-674b8bbfcf-6kv44" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--6kv44-eth0" Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:30.986 [INFO][4781] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" HandleID="k8s-pod-network.ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" Workload="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--6kv44-eth0" Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:30.986 [INFO][4781] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" HandleID="k8s-pod-network.ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" Workload="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--6kv44-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d730), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4593-0-0-n-ea467cc685", "pod":"coredns-674b8bbfcf-6kv44", "timestamp":"2026-01-28 00:07:30.986589822 +0000 UTC"}, Hostname:"ci-4593-0-0-n-ea467cc685", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:30.986 [INFO][4781] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:30.986 [INFO][4781] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:30.986 [INFO][4781] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-ea467cc685' Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:30.996 [INFO][4781] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:31.000 [INFO][4781] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:31.004 [INFO][4781] ipam/ipam.go 511: Trying affinity for 192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:31.005 [INFO][4781] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:31.008 [INFO][4781] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:31.008 [INFO][4781] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.0/26 handle="k8s-pod-network.ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:31.010 [INFO][4781] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:31.014 [INFO][4781] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.0/26 handle="k8s-pod-network.ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:31.022 [INFO][4781] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.7/26] block=192.168.115.0/26 handle="k8s-pod-network.ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:31.022 [INFO][4781] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.7/26] handle="k8s-pod-network.ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:31.023 [INFO][4781] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 28 00:07:31.042059 containerd[1660]: 2026-01-28 00:07:31.023 [INFO][4781] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.7/26] IPv6=[] ContainerID="ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" HandleID="k8s-pod-network.ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" Workload="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--6kv44-eth0" Jan 28 00:07:31.042612 containerd[1660]: 2026-01-28 00:07:31.024 [INFO][4767] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" Namespace="kube-system" Pod="coredns-674b8bbfcf-6kv44" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--6kv44-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--6kv44-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"88283619-c3e2-4b73-b5a3-401f940acb8e", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 6, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"", Pod:"coredns-674b8bbfcf-6kv44", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.115.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4a9dbb6d91b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:31.042612 containerd[1660]: 2026-01-28 00:07:31.025 [INFO][4767] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.7/32] ContainerID="ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" Namespace="kube-system" Pod="coredns-674b8bbfcf-6kv44" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--6kv44-eth0" Jan 28 00:07:31.042612 containerd[1660]: 2026-01-28 00:07:31.025 [INFO][4767] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4a9dbb6d91b ContainerID="ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" Namespace="kube-system" Pod="coredns-674b8bbfcf-6kv44" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--6kv44-eth0" Jan 28 00:07:31.042612 containerd[1660]: 2026-01-28 00:07:31.028 [INFO][4767] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-6kv44" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--6kv44-eth0" Jan 28 00:07:31.042612 containerd[1660]: 2026-01-28 00:07:31.028 [INFO][4767] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" Namespace="kube-system" Pod="coredns-674b8bbfcf-6kv44" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--6kv44-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--6kv44-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"88283619-c3e2-4b73-b5a3-401f940acb8e", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 6, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a", Pod:"coredns-674b8bbfcf-6kv44", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.115.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4a9dbb6d91b", MAC:"6a:a8:b4:ff:bf:64", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:31.042612 containerd[1660]: 2026-01-28 00:07:31.040 [INFO][4767] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" Namespace="kube-system" Pod="coredns-674b8bbfcf-6kv44" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-coredns--674b8bbfcf--6kv44-eth0" Jan 28 00:07:31.053000 audit[4798]: NETFILTER_CFG table=filter:132 family=2 entries=36 op=nft_register_chain pid=4798 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 00:07:31.053000 audit[4798]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19176 a0=3 a1=ffffd1dfc6d0 a2=0 a3=ffffa4154fa8 items=0 ppid=4075 pid=4798 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.053000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 00:07:31.066249 kubelet[2911]: E0128 00:07:31.064790 2911 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:07:31.072483 containerd[1660]: time="2026-01-28T00:07:31.072441163Z" level=info msg="connecting to shim ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a" address="unix:///run/containerd/s/2ce02fc7870ed2d22203c6e1e41bc5796832668c4f67704365a2ab2389135f6e" namespace=k8s.io protocol=ttrpc version=3 Jan 28 00:07:31.094134 kubelet[2911]: I0128 00:07:31.094068 2911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-vcgk6" podStartSLOduration=40.094053189 podStartE2EDuration="40.094053189s" podCreationTimestamp="2026-01-28 00:06:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 00:07:31.076649056 +0000 UTC m=+46.235825353" watchObservedRunningTime="2026-01-28 00:07:31.094053189 +0000 UTC m=+46.253229446" Jan 28 00:07:31.103000 audit[4832]: NETFILTER_CFG table=filter:133 family=2 entries=20 op=nft_register_rule pid=4832 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:31.103000 audit[4832]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffee3e6900 a2=0 a3=1 items=0 ppid=3024 pid=4832 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.103000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:31.111000 audit[4832]: NETFILTER_CFG table=nat:134 family=2 entries=14 op=nft_register_rule pid=4832 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:31.111000 audit[4832]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffee3e6900 a2=0 a3=1 items=0 ppid=3024 pid=4832 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.111000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:31.115444 systemd[1]: Started cri-containerd-ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a.scope - libcontainer container ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a. 
Jan 28 00:07:31.125000 audit: BPF prog-id=241 op=LOAD Jan 28 00:07:31.126000 audit: BPF prog-id=242 op=LOAD Jan 28 00:07:31.126000 audit[4819]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4808 pid=4819 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.126000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564393337626163386661656532393436623330316563333330386663 Jan 28 00:07:31.127000 audit: BPF prog-id=242 op=UNLOAD Jan 28 00:07:31.127000 audit[4819]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4808 pid=4819 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.127000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564393337626163386661656532393436623330316563333330386663 Jan 28 00:07:31.128000 audit: BPF prog-id=243 op=LOAD Jan 28 00:07:31.128000 audit[4819]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4808 pid=4819 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.128000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564393337626163386661656532393436623330316563333330386663 Jan 28 00:07:31.129000 audit: BPF prog-id=244 op=LOAD Jan 28 00:07:31.129000 audit[4819]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4808 pid=4819 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564393337626163386661656532393436623330316563333330386663 Jan 28 00:07:31.129000 audit: BPF prog-id=244 op=UNLOAD Jan 28 00:07:31.129000 audit[4819]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4808 pid=4819 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564393337626163386661656532393436623330316563333330386663 Jan 28 00:07:31.129000 audit: BPF prog-id=243 op=UNLOAD Jan 28 00:07:31.129000 audit[4819]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4808 pid=4819 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564393337626163386661656532393436623330316563333330386663 Jan 28 00:07:31.129000 audit: BPF prog-id=245 op=LOAD Jan 28 00:07:31.129000 audit[4819]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4808 pid=4819 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564393337626163386661656532393436623330316563333330386663 Jan 28 00:07:31.134000 audit[4841]: NETFILTER_CFG table=filter:135 family=2 entries=17 op=nft_register_rule pid=4841 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:31.134000 audit[4841]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffebdd3210 a2=0 a3=1 items=0 ppid=3024 pid=4841 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.134000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:31.141000 audit[4841]: NETFILTER_CFG table=nat:136 family=2 entries=35 op=nft_register_chain pid=4841 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:31.141000 audit[4841]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffebdd3210 a2=0 a3=1 items=0 ppid=3024 pid=4841 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.141000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:31.160469 containerd[1660]: time="2026-01-28T00:07:31.160360830Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-6kv44,Uid:88283619-c3e2-4b73-b5a3-401f940acb8e,Namespace:kube-system,Attempt:0,} returns sandbox id \"ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a\"" Jan 28 00:07:31.167146 containerd[1660]: time="2026-01-28T00:07:31.166263648Z" level=info msg="CreateContainer within sandbox \"ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 28 00:07:31.180567 containerd[1660]: time="2026-01-28T00:07:31.180342771Z" level=info msg="Container 3adb5f80ba2b3bf39da603811698de2e5ba31cb6ea87d325ed45913c70baa51a: CDI devices from CRI Config.CDIDevices: []" Jan 28 00:07:31.192157 containerd[1660]: time="2026-01-28T00:07:31.192111887Z" level=info msg="CreateContainer within sandbox 
\"ed937bac8faee2946b301ec3308fc691c78986a0425ba1f4ab3b1f920dba917a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3adb5f80ba2b3bf39da603811698de2e5ba31cb6ea87d325ed45913c70baa51a\"" Jan 28 00:07:31.192705 containerd[1660]: time="2026-01-28T00:07:31.192577368Z" level=info msg="StartContainer for \"3adb5f80ba2b3bf39da603811698de2e5ba31cb6ea87d325ed45913c70baa51a\"" Jan 28 00:07:31.193583 containerd[1660]: time="2026-01-28T00:07:31.193555811Z" level=info msg="connecting to shim 3adb5f80ba2b3bf39da603811698de2e5ba31cb6ea87d325ed45913c70baa51a" address="unix:///run/containerd/s/2ce02fc7870ed2d22203c6e1e41bc5796832668c4f67704365a2ab2389135f6e" protocol=ttrpc version=3 Jan 28 00:07:31.216399 systemd[1]: Started cri-containerd-3adb5f80ba2b3bf39da603811698de2e5ba31cb6ea87d325ed45913c70baa51a.scope - libcontainer container 3adb5f80ba2b3bf39da603811698de2e5ba31cb6ea87d325ed45913c70baa51a. Jan 28 00:07:31.228000 audit: BPF prog-id=246 op=LOAD Jan 28 00:07:31.229693 containerd[1660]: time="2026-01-28T00:07:31.229658161Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:31.228000 audit: BPF prog-id=247 op=LOAD Jan 28 00:07:31.228000 audit[4849]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=4808 pid=4849 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361646235663830626132623362663339646136303338313136393864 Jan 28 00:07:31.228000 audit: BPF prog-id=247 op=UNLOAD Jan 28 00:07:31.228000 audit[4849]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4808 pid=4849 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361646235663830626132623362663339646136303338313136393864 Jan 28 00:07:31.228000 audit: BPF prog-id=248 op=LOAD Jan 28 00:07:31.228000 audit[4849]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=4808 pid=4849 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361646235663830626132623362663339646136303338313136393864 Jan 28 00:07:31.228000 audit: BPF prog-id=249 op=LOAD Jan 28 00:07:31.228000 audit[4849]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=4808 pid=4849 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 
00:07:31.228000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361646235663830626132623362663339646136303338313136393864 Jan 28 00:07:31.229000 audit: BPF prog-id=249 op=UNLOAD Jan 28 00:07:31.229000 audit[4849]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4808 pid=4849 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.229000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361646235663830626132623362663339646136303338313136393864 Jan 28 00:07:31.229000 audit: BPF prog-id=248 op=UNLOAD Jan 28 00:07:31.229000 audit[4849]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4808 pid=4849 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.229000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361646235663830626132623362663339646136303338313136393864 Jan 28 00:07:31.229000 audit: BPF prog-id=250 op=LOAD Jan 28 00:07:31.229000 audit[4849]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=4808 pid=4849 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:31.229000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3361646235663830626132623362663339646136303338313136393864 Jan 28 00:07:31.231534 containerd[1660]: time="2026-01-28T00:07:31.231115845Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 00:07:31.231534 containerd[1660]: time="2026-01-28T00:07:31.231229165Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:31.231638 kubelet[2911]: E0128 00:07:31.231407 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:07:31.231638 kubelet[2911]: E0128 00:07:31.231456 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:07:31.231638 kubelet[2911]: E0128 00:07:31.231576 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs5cv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-gql7d_calico-system(33c364a8-976a-4a66-b401-5e5e3f5c6aca): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:31.232811 kubelet[2911]: E0128 00:07:31.232773 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:07:31.252339 containerd[1660]: time="2026-01-28T00:07:31.252213869Z" level=info msg="StartContainer for \"3adb5f80ba2b3bf39da603811698de2e5ba31cb6ea87d325ed45913c70baa51a\" returns successfully" Jan 28 00:07:31.861663 systemd-networkd[1576]: 
caliee4c01dd963: Gained IPv6LL Jan 28 00:07:31.928499 containerd[1660]: time="2026-01-28T00:07:31.928465363Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-564d6bbdf8-pmnrt,Uid:0bfcf205-da0c-44de-9408-ba1ddb5cd934,Namespace:calico-apiserver,Attempt:0,}" Jan 28 00:07:32.033513 systemd-networkd[1576]: califd8f3c6dfa8: Link UP Jan 28 00:07:32.035402 systemd-networkd[1576]: califd8f3c6dfa8: Gained carrier Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:31.967 [INFO][4883] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--pmnrt-eth0 calico-apiserver-564d6bbdf8- calico-apiserver 0bfcf205-da0c-44de-9408-ba1ddb5cd934 850 0 2026-01-28 00:07:00 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:564d6bbdf8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4593-0-0-n-ea467cc685 calico-apiserver-564d6bbdf8-pmnrt eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] califd8f3c6dfa8 [] [] }} ContainerID="1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-pmnrt" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--pmnrt-" Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:31.967 [INFO][4883] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-pmnrt" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--pmnrt-eth0" Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:31.989 [INFO][4896] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" HandleID="k8s-pod-network.1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" Workload="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--pmnrt-eth0" Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:31.989 [INFO][4896] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" HandleID="k8s-pod-network.1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" Workload="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--pmnrt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dd5c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4593-0-0-n-ea467cc685", "pod":"calico-apiserver-564d6bbdf8-pmnrt", "timestamp":"2026-01-28 00:07:31.989724149 +0000 UTC"}, Hostname:"ci-4593-0-0-n-ea467cc685", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:31.989 [INFO][4896] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:31.989 [INFO][4896] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:31.990 [INFO][4896] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4593-0-0-n-ea467cc685' Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:31.998 [INFO][4896] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:32.004 [INFO][4896] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:32.008 [INFO][4896] ipam/ipam.go 511: Trying affinity for 192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:32.010 [INFO][4896] ipam/ipam.go 158: Attempting to load block cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:32.012 [INFO][4896] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.115.0/26 host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:32.012 [INFO][4896] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.115.0/26 handle="k8s-pod-network.1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:32.013 [INFO][4896] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:32.019 [INFO][4896] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.115.0/26 handle="k8s-pod-network.1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:32.027 [INFO][4896] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.115.8/26] block=192.168.115.0/26 handle="k8s-pod-network.1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:32.027 [INFO][4896] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.115.8/26] handle="k8s-pod-network.1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" host="ci-4593-0-0-n-ea467cc685" Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:32.027 [INFO][4896] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 28 00:07:32.050132 containerd[1660]: 2026-01-28 00:07:32.027 [INFO][4896] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.115.8/26] IPv6=[] ContainerID="1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" HandleID="k8s-pod-network.1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" Workload="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--pmnrt-eth0" Jan 28 00:07:32.050997 containerd[1660]: 2026-01-28 00:07:32.028 [INFO][4883] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-pmnrt" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--pmnrt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--pmnrt-eth0", GenerateName:"calico-apiserver-564d6bbdf8-", Namespace:"calico-apiserver", SelfLink:"", UID:"0bfcf205-da0c-44de-9408-ba1ddb5cd934", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 7, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"564d6bbdf8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"", Pod:"calico-apiserver-564d6bbdf8-pmnrt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.115.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califd8f3c6dfa8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:32.050997 containerd[1660]: 2026-01-28 00:07:32.028 [INFO][4883] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.115.8/32] ContainerID="1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-pmnrt" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--pmnrt-eth0" Jan 28 00:07:32.050997 containerd[1660]: 2026-01-28 00:07:32.028 [INFO][4883] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califd8f3c6dfa8 ContainerID="1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-pmnrt" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--pmnrt-eth0" Jan 28 00:07:32.050997 containerd[1660]: 2026-01-28 00:07:32.036 [INFO][4883] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-pmnrt" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--pmnrt-eth0" Jan 28 00:07:32.050997 containerd[1660]: 2026-01-28 00:07:32.036 
[INFO][4883] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-pmnrt" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--pmnrt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--pmnrt-eth0", GenerateName:"calico-apiserver-564d6bbdf8-", Namespace:"calico-apiserver", SelfLink:"", UID:"0bfcf205-da0c-44de-9408-ba1ddb5cd934", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2026, time.January, 28, 0, 7, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"564d6bbdf8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4593-0-0-n-ea467cc685", ContainerID:"1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a", Pod:"calico-apiserver-564d6bbdf8-pmnrt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.115.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califd8f3c6dfa8", MAC:"96:75:62:00:83:04", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 28 00:07:32.050997 containerd[1660]: 2026-01-28 00:07:32.048 [INFO][4883] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" Namespace="calico-apiserver" Pod="calico-apiserver-564d6bbdf8-pmnrt" WorkloadEndpoint="ci--4593--0--0--n--ea467cc685-k8s-calico--apiserver--564d6bbdf8--pmnrt-eth0" Jan 28 00:07:32.053448 systemd-networkd[1576]: cali30ae9e816a0: Gained IPv6LL Jan 28 00:07:32.062000 audit[4914]: NETFILTER_CFG table=filter:137 family=2 entries=53 op=nft_register_chain pid=4914 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 28 00:07:32.062000 audit[4914]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=26624 a0=3 a1=ffffd4f15ec0 a2=0 a3=ffff879e5fa8 items=0 ppid=4075 pid=4914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:32.062000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 28 00:07:32.070636 kubelet[2911]: E0128 00:07:32.070325 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve 
image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:07:32.070636 kubelet[2911]: E0128 00:07:32.070456 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:07:32.087985 containerd[1660]: time="2026-01-28T00:07:32.087503646Z" level=info msg="connecting to shim 1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a" address="unix:///run/containerd/s/96e0f8ee04ecd0fbfd1d2d2126247f9c88d61d60cdaada4ea7f2a9a4949f3ced" namespace=k8s.io protocol=ttrpc version=3 Jan 28 00:07:32.101718 kubelet[2911]: I0128 00:07:32.101657 2911 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-6kv44" podStartSLOduration=41.101640009 podStartE2EDuration="41.101640009s" podCreationTimestamp="2026-01-28 00:06:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 00:07:32.084649197 +0000 UTC m=+47.243825494" watchObservedRunningTime="2026-01-28 00:07:32.101640009 +0000 UTC m=+47.260816266" Jan 28 00:07:32.107000 audit[4947]: NETFILTER_CFG table=filter:138 family=2 entries=14 op=nft_register_rule pid=4947 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:32.107000 audit[4947]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe47d6c80 a2=0 a3=1 items=0 ppid=3024 pid=4947 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:32.107000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:32.113000 audit[4947]: NETFILTER_CFG table=nat:139 family=2 entries=44 op=nft_register_rule pid=4947 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:32.113000 audit[4947]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffe47d6c80 a2=0 a3=1 items=0 ppid=3024 pid=4947 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:32.113000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:32.119406 systemd[1]: Started cri-containerd-1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a.scope - libcontainer container 
1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a. Jan 28 00:07:32.128000 audit: BPF prog-id=251 op=LOAD Jan 28 00:07:32.129000 audit: BPF prog-id=252 op=LOAD Jan 28 00:07:32.129000 audit[4936]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138180 a2=98 a3=0 items=0 ppid=4923 pid=4936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:32.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3166373833393232663035373030636163643634626330653034393830 Jan 28 00:07:32.129000 audit: BPF prog-id=252 op=UNLOAD Jan 28 00:07:32.129000 audit[4936]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4923 pid=4936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:32.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3166373833393232663035373030636163643634626330653034393830 Jan 28 00:07:32.129000 audit: BPF prog-id=253 op=LOAD Jan 28 00:07:32.129000 audit[4936]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001383e8 a2=98 a3=0 items=0 ppid=4923 pid=4936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:32.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3166373833393232663035373030636163643634626330653034393830 Jan 28 00:07:32.129000 audit: BPF prog-id=254 op=LOAD Jan 28 00:07:32.129000 audit[4936]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000138168 a2=98 a3=0 items=0 ppid=4923 pid=4936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:32.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3166373833393232663035373030636163643634626330653034393830 Jan 28 00:07:32.129000 audit: BPF prog-id=254 op=UNLOAD Jan 28 00:07:32.129000 audit[4936]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4923 pid=4936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:32.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3166373833393232663035373030636163643634626330653034393830 Jan 28 00:07:32.129000 audit: 
BPF prog-id=253 op=UNLOAD Jan 28 00:07:32.129000 audit[4936]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4923 pid=4936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:32.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3166373833393232663035373030636163643634626330653034393830 Jan 28 00:07:32.129000 audit: BPF prog-id=255 op=LOAD Jan 28 00:07:32.129000 audit[4936]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138648 a2=98 a3=0 items=0 ppid=4923 pid=4936 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:32.129000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3166373833393232663035373030636163643634626330653034393830 Jan 28 00:07:32.154395 containerd[1660]: time="2026-01-28T00:07:32.154346169Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-564d6bbdf8-pmnrt,Uid:0bfcf205-da0c-44de-9408-ba1ddb5cd934,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1f783922f05700cacd64bc0e04980d4db3fed38676926764e907939542a8ad3a\"" Jan 28 00:07:32.155754 containerd[1660]: time="2026-01-28T00:07:32.155732733Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:07:32.245488 systemd-networkd[1576]: cali00e35555e9c: Gained IPv6LL Jan 28 00:07:32.501477 systemd-networkd[1576]: cali4a9dbb6d91b: Gained IPv6LL Jan 28 00:07:32.509948 containerd[1660]: time="2026-01-28T00:07:32.509896049Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:32.511300 containerd[1660]: time="2026-01-28T00:07:32.511264733Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:07:32.511378 containerd[1660]: time="2026-01-28T00:07:32.511289973Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:32.511538 kubelet[2911]: E0128 00:07:32.511490 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:07:32.511591 kubelet[2911]: E0128 00:07:32.511551 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:07:32.511859 kubelet[2911]: E0128 00:07:32.511701 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vj8fj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-564d6bbdf8-pmnrt_calico-apiserver(0bfcf205-da0c-44de-9408-ba1ddb5cd934): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:32.513005 kubelet[2911]: E0128 00:07:32.512973 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:07:33.071779 kubelet[2911]: E0128 00:07:33.071737 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:07:33.102000 audit[4964]: NETFILTER_CFG table=filter:140 family=2 entries=14 op=nft_register_rule pid=4964 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:33.104768 kernel: kauditd_printk_skb: 233 callbacks suppressed Jan 28 00:07:33.104848 kernel: audit: type=1325 audit(1769558853.102:742): table=filter:140 family=2 entries=14 op=nft_register_rule pid=4964 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:33.102000 audit[4964]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc9d143b0 a2=0 a3=1 items=0 ppid=3024 pid=4964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:33.110329 kernel: audit: type=1300 audit(1769558853.102:742): arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc9d143b0 a2=0 a3=1 items=0 ppid=3024 pid=4964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:33.110441 kernel: audit: type=1327 audit(1769558853.102:742): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:33.102000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:33.115000 audit[4964]: NETFILTER_CFG table=nat:141 family=2 entries=20 op=nft_register_rule pid=4964 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:33.115000 audit[4964]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffc9d143b0 a2=0 a3=1 items=0 ppid=3024 pid=4964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:33.121747 kernel: audit: type=1325 audit(1769558853.115:743): table=nat:141 family=2 entries=20 op=nft_register_rule pid=4964 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:33.121954 kernel: audit: type=1300 audit(1769558853.115:743): arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffc9d143b0 a2=0 a3=1 items=0 ppid=3024 pid=4964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:33.122090 kernel: audit: type=1327 audit(1769558853.115:743): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:33.115000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:33.461388 systemd-networkd[1576]: califd8f3c6dfa8: Gained IPv6LL Jan 28 00:07:34.077051 kubelet[2911]: E0128 00:07:34.076992 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:07:34.135000 
audit[4966]: NETFILTER_CFG table=filter:142 family=2 entries=14 op=nft_register_rule pid=4966 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:34.135000 audit[4966]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd16ee620 a2=0 a3=1 items=0 ppid=3024 pid=4966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:34.142635 kernel: audit: type=1325 audit(1769558854.135:744): table=filter:142 family=2 entries=14 op=nft_register_rule pid=4966 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:34.142696 kernel: audit: type=1300 audit(1769558854.135:744): arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd16ee620 a2=0 a3=1 items=0 ppid=3024 pid=4966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:34.135000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:34.144497 kernel: audit: type=1327 audit(1769558854.135:744): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:34.152000 audit[4966]: NETFILTER_CFG table=nat:143 family=2 entries=56 op=nft_register_chain pid=4966 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:34.152000 audit[4966]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=ffffd16ee620 a2=0 a3=1 items=0 ppid=3024 pid=4966 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:07:34.152000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:07:34.156232 kernel: audit: type=1325 audit(1769558854.152:745): table=nat:143 family=2 entries=56 op=nft_register_chain pid=4966 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:07:34.626831 kubelet[2911]: I0128 00:07:34.626740 2911 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 00:07:36.930285 containerd[1660]: time="2026-01-28T00:07:36.930175715Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 00:07:37.272301 containerd[1660]: time="2026-01-28T00:07:37.270828550Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:37.273066 containerd[1660]: time="2026-01-28T00:07:37.272975596Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 00:07:37.273066 containerd[1660]: time="2026-01-28T00:07:37.273013956Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:37.273254 kubelet[2911]: E0128 00:07:37.273163 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 00:07:37.273254 kubelet[2911]: E0128 00:07:37.273223 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 00:07:37.273602 kubelet[2911]: E0128 00:07:37.273333 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:317bcf13c30d4d6c9248c8e05fdeda91,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-468nq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d74b86764-tnxb5_calico-system(944e37ab-4f3a-457c-b85f-87dff3debf4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:37.275127 containerd[1660]: time="2026-01-28T00:07:37.275100882Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 00:07:37.599150 containerd[1660]: time="2026-01-28T00:07:37.598983026Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:37.600825 containerd[1660]: time="2026-01-28T00:07:37.600684871Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 00:07:37.600825 containerd[1660]: time="2026-01-28T00:07:37.600773752Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:37.600972 kubelet[2911]: E0128 00:07:37.600926 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 00:07:37.601023 kubelet[2911]: E0128 00:07:37.600974 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 00:07:37.601135 kubelet[2911]: E0128 00:07:37.601088 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-468nq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d74b86764-tnxb5_calico-system(944e37ab-4f3a-457c-b85f-87dff3debf4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:37.602334 kubelet[2911]: E0128 00:07:37.602292 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:07:40.929929 containerd[1660]: time="2026-01-28T00:07:40.929871343Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 00:07:41.266894 containerd[1660]: time="2026-01-28T00:07:41.266662086Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:41.268505 containerd[1660]: time="2026-01-28T00:07:41.268401731Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 00:07:41.268505 containerd[1660]: time="2026-01-28T00:07:41.268457172Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:41.270452 kubelet[2911]: E0128 00:07:41.270347 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:07:41.270452 kubelet[2911]: E0128 00:07:41.270436 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:07:41.271243 kubelet[2911]: E0128 00:07:41.270946 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nkhp5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-579c57b986-v9zhd_calico-system(7d3bf2f0-73b0-413a-909b-a8ef20ea0438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:41.273354 kubelet[2911]: E0128 00:07:41.273286 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:07:41.929759 containerd[1660]: time="2026-01-28T00:07:41.929721140Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:07:42.256605 containerd[1660]: time="2026-01-28T00:07:42.256459852Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:42.258014 containerd[1660]: time="2026-01-28T00:07:42.257975937Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:07:42.258862 containerd[1660]: time="2026-01-28T00:07:42.258059457Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:42.259369 kubelet[2911]: E0128 00:07:42.258475 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:07:42.259369 kubelet[2911]: E0128 00:07:42.258521 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:07:42.259369 kubelet[2911]: E0128 00:07:42.258642 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lvvxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-564d6bbdf8-z4ccx_calico-apiserver(c30f0bb9-9acc-4a29-aba6-a8788797d9e8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:42.259859 kubelet[2911]: E0128 00:07:42.259811 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:07:43.930863 containerd[1660]: time="2026-01-28T00:07:43.929947655Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 00:07:44.269482 containerd[1660]: time="2026-01-28T00:07:44.269353406Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:44.271139 containerd[1660]: time="2026-01-28T00:07:44.271074851Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 00:07:44.271227 containerd[1660]: time="2026-01-28T00:07:44.271164532Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:44.271379 kubelet[2911]: E0128 00:07:44.271301 2911 
log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:07:44.271379 kubelet[2911]: E0128 00:07:44.271353 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:07:44.271807 containerd[1660]: time="2026-01-28T00:07:44.271629813Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 00:07:44.272297 kubelet[2911]: E0128 00:07:44.272182 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs5cv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-gql7d_calico-system(33c364a8-976a-4a66-b401-5e5e3f5c6aca): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:44.603231 containerd[1660]: time="2026-01-28T00:07:44.602446058Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:44.605068 containerd[1660]: time="2026-01-28T00:07:44.605021906Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 00:07:44.605141 
containerd[1660]: time="2026-01-28T00:07:44.605106106Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:44.605332 kubelet[2911]: E0128 00:07:44.605278 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:07:44.605384 kubelet[2911]: E0128 00:07:44.605340 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:07:44.605847 kubelet[2911]: E0128 00:07:44.605704 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dh76x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-4g849_calico-system(115118f6-6ada-4444-93ef-3a99eaaacd44): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:44.606055 containerd[1660]: time="2026-01-28T00:07:44.606024029Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 00:07:44.607728 kubelet[2911]: E0128 00:07:44.607657 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:07:44.943349 containerd[1660]: time="2026-01-28T00:07:44.943146493Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:44.945579 containerd[1660]: time="2026-01-28T00:07:44.945536460Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 00:07:44.945625 containerd[1660]: time="2026-01-28T00:07:44.945571100Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:44.945767 kubelet[2911]: E0128 00:07:44.945729 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:07:44.945829 kubelet[2911]: E0128 00:07:44.945779 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:07:44.946231 kubelet[2911]: E0128 00:07:44.945900 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs5cv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-gql7d_calico-system(33c364a8-976a-4a66-b401-5e5e3f5c6aca): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:44.947850 kubelet[2911]: E0128 00:07:44.947471 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:07:45.929184 containerd[1660]: time="2026-01-28T00:07:45.929149928Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:07:46.266697 containerd[1660]: time="2026-01-28T00:07:46.266273632Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:07:46.267756 containerd[1660]: time="2026-01-28T00:07:46.267719516Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:07:46.267839 containerd[1660]: time="2026-01-28T00:07:46.267759516Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:07:46.268016 kubelet[2911]: E0128 00:07:46.267982 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:07:46.268261 kubelet[2911]: E0128 00:07:46.268029 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:07:46.268261 kubelet[2911]: E0128 00:07:46.268143 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vj8fj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-564d6bbdf8-pmnrt_calico-apiserver(0bfcf205-da0c-44de-9408-ba1ddb5cd934): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:07:46.269531 kubelet[2911]: E0128 00:07:46.269485 2911 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:07:51.930155 kubelet[2911]: E0128 00:07:51.930092 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:07:51.931328 kubelet[2911]: E0128 00:07:51.930581 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:07:55.928920 kubelet[2911]: E0128 00:07:55.928864 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:07:58.930301 kubelet[2911]: E0128 00:07:58.930248 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:07:59.929223 kubelet[2911]: E0128 00:07:59.929159 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:07:59.930200 kubelet[2911]: E0128 00:07:59.930046 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:08:02.930158 containerd[1660]: time="2026-01-28T00:08:02.930011765Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 00:08:03.491622 containerd[1660]: time="2026-01-28T00:08:03.491557390Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:08:03.494458 containerd[1660]: time="2026-01-28T00:08:03.494407319Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 00:08:03.494536 containerd[1660]: time="2026-01-28T00:08:03.494491799Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 00:08:03.494703 kubelet[2911]: E0128 00:08:03.494658 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 00:08:03.495033 kubelet[2911]: E0128 00:08:03.494712 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 00:08:03.495033 kubelet[2911]: E0128 00:08:03.494920 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:317bcf13c30d4d6c9248c8e05fdeda91,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-468nq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d74b86764-tnxb5_calico-system(944e37ab-4f3a-457c-b85f-87dff3debf4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 00:08:03.495227 containerd[1660]: time="2026-01-28T00:08:03.495178761Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 00:08:03.972177 containerd[1660]: time="2026-01-28T00:08:03.972121850Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:08:03.976043 containerd[1660]: time="2026-01-28T00:08:03.976002302Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 00:08:03.976547 containerd[1660]: time="2026-01-28T00:08:03.976034542Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 00:08:03.976684 kubelet[2911]: E0128 00:08:03.976634 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:08:03.976739 kubelet[2911]: E0128 00:08:03.976718 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:08:03.977062 kubelet[2911]: E0128 00:08:03.976963 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nkhp5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-579c57b986-v9zhd_calico-system(7d3bf2f0-73b0-413a-909b-a8ef20ea0438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 00:08:03.977768 containerd[1660]: time="2026-01-28T00:08:03.977459666Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 00:08:03.978152 kubelet[2911]: E0128 00:08:03.978109 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:08:04.498593 containerd[1660]: 
time="2026-01-28T00:08:04.498490289Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:08:04.501852 containerd[1660]: time="2026-01-28T00:08:04.501810539Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 00:08:04.501934 containerd[1660]: time="2026-01-28T00:08:04.501871099Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 00:08:04.502084 kubelet[2911]: E0128 00:08:04.502050 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 00:08:04.502511 kubelet[2911]: E0128 00:08:04.502097 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 00:08:04.502511 kubelet[2911]: E0128 00:08:04.502263 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-468nq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d74b86764-tnxb5_calico-system(944e37ab-4f3a-457c-b85f-87dff3debf4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 00:08:04.503551 kubelet[2911]: E0128 00:08:04.503496 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:08:09.930227 containerd[1660]: time="2026-01-28T00:08:09.930174187Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:08:10.441359 containerd[1660]: time="2026-01-28T00:08:10.441254179Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:08:10.443065 containerd[1660]: time="2026-01-28T00:08:10.443017984Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:08:10.443155 containerd[1660]: time="2026-01-28T00:08:10.443035424Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:08:10.443354 kubelet[2911]: E0128 00:08:10.443268 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:08:10.443354 kubelet[2911]: E0128 00:08:10.443320 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:08:10.444023 kubelet[2911]: E0128 00:08:10.443474 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lvvxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-564d6bbdf8-z4ccx_calico-apiserver(c30f0bb9-9acc-4a29-aba6-a8788797d9e8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:08:10.444663 kubelet[2911]: E0128 00:08:10.444609 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:08:10.932253 containerd[1660]: time="2026-01-28T00:08:10.932173470Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 00:08:11.442233 containerd[1660]: time="2026-01-28T00:08:11.442173899Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:08:11.444794 containerd[1660]: time="2026-01-28T00:08:11.444743107Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 00:08:11.444862 containerd[1660]: time="2026-01-28T00:08:11.444806227Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 00:08:11.445010 kubelet[2911]: 
E0128 00:08:11.444966 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:08:11.445010 kubelet[2911]: E0128 00:08:11.445012 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:08:11.445433 kubelet[2911]: E0128 00:08:11.445135 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dh76x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-4g849_calico-system(115118f6-6ada-4444-93ef-3a99eaaacd44): ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 00:08:11.446357 kubelet[2911]: E0128 00:08:11.446325 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:08:13.929174 containerd[1660]: time="2026-01-28T00:08:13.929135893Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 00:08:14.423397 containerd[1660]: time="2026-01-28T00:08:14.423348194Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:08:14.425646 containerd[1660]: time="2026-01-28T00:08:14.425582721Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 00:08:14.425760 containerd[1660]: time="2026-01-28T00:08:14.425680001Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 00:08:14.427642 kubelet[2911]: E0128 00:08:14.427394 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:08:14.427642 kubelet[2911]: E0128 00:08:14.427449 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:08:14.427642 kubelet[2911]: E0128 00:08:14.427575 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs5cv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-gql7d_calico-system(33c364a8-976a-4a66-b401-5e5e3f5c6aca): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 28 00:08:14.429939 containerd[1660]: time="2026-01-28T00:08:14.429531133Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 00:08:14.952593 containerd[1660]: time="2026-01-28T00:08:14.952453321Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:08:14.956089 containerd[1660]: time="2026-01-28T00:08:14.956027812Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 00:08:14.956161 containerd[1660]: time="2026-01-28T00:08:14.956092412Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 00:08:14.956324 kubelet[2911]: E0128 00:08:14.956292 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:08:14.956427 kubelet[2911]: E0128 00:08:14.956410 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:08:14.956690 kubelet[2911]: E0128 00:08:14.956647 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs5cv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-gql7d_calico-system(33c364a8-976a-4a66-b401-5e5e3f5c6aca): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 00:08:14.957024 containerd[1660]: time="2026-01-28T00:08:14.956984495Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:08:14.957944 kubelet[2911]: E0128 00:08:14.957915 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:08:15.478392 containerd[1660]: time="2026-01-28T00:08:15.478341438Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 
00:08:15.481111 containerd[1660]: time="2026-01-28T00:08:15.481004926Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:08:15.481239 containerd[1660]: time="2026-01-28T00:08:15.481099727Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:08:15.481373 kubelet[2911]: E0128 00:08:15.481335 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:08:15.481636 kubelet[2911]: E0128 00:08:15.481387 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:08:15.481636 kubelet[2911]: E0128 00:08:15.481515 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vj8fj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-564d6bbdf8-pmnrt_calico-apiserver(0bfcf205-da0c-44de-9408-ba1ddb5cd934): ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:08:15.482773 kubelet[2911]: E0128 00:08:15.482727 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:08:15.930842 kubelet[2911]: E0128 00:08:15.930761 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:08:17.928982 kubelet[2911]: E0128 00:08:17.928923 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:08:22.932151 kubelet[2911]: E0128 00:08:22.931810 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:08:25.929979 kubelet[2911]: E0128 00:08:25.929647 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:08:28.931297 kubelet[2911]: E0128 00:08:28.931124 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" 
with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:08:28.931905 kubelet[2911]: E0128 00:08:28.931506 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:08:29.930171 kubelet[2911]: E0128 00:08:29.930117 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:08:32.931030 kubelet[2911]: E0128 00:08:32.930932 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:08:33.930548 kubelet[2911]: E0128 00:08:33.930398 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:08:36.930122 kubelet[2911]: E0128 00:08:36.930061 2911 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:08:40.931257 kubelet[2911]: E0128 00:08:40.931006 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:08:43.930123 containerd[1660]: time="2026-01-28T00:08:43.930079016Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 00:08:44.472939 containerd[1660]: time="2026-01-28T00:08:44.472610703Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:08:44.475197 containerd[1660]: time="2026-01-28T00:08:44.475149871Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 00:08:44.475369 containerd[1660]: time="2026-01-28T00:08:44.475199351Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 00:08:44.475710 kubelet[2911]: E0128 00:08:44.475514 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 00:08:44.475710 kubelet[2911]: E0128 00:08:44.475558 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 00:08:44.476106 kubelet[2911]: E0128 00:08:44.476005 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:317bcf13c30d4d6c9248c8e05fdeda91,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-468nq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d74b86764-tnxb5_calico-system(944e37ab-4f3a-457c-b85f-87dff3debf4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 00:08:44.477758 kubelet[2911]: E0128 00:08:44.477701 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:08:44.932635 kubelet[2911]: E0128 00:08:44.932568 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" 
podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:08:45.930248 kubelet[2911]: E0128 00:08:45.930150 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:08:47.929134 kubelet[2911]: E0128 00:08:47.929051 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:08:47.929845 containerd[1660]: time="2026-01-28T00:08:47.929628523Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 00:08:48.441405 containerd[1660]: time="2026-01-28T00:08:48.441352118Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:08:48.449462 containerd[1660]: time="2026-01-28T00:08:48.449398902Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 00:08:48.449588 containerd[1660]: time="2026-01-28T00:08:48.449465182Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 00:08:48.449667 kubelet[2911]: E0128 00:08:48.449627 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:08:48.449715 kubelet[2911]: E0128 00:08:48.449675 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:08:48.449844 kubelet[2911]: E0128 00:08:48.449797 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nkhp5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-579c57b986-v9zhd_calico-system(7d3bf2f0-73b0-413a-909b-a8ef20ea0438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 00:08:48.451037 kubelet[2911]: E0128 00:08:48.450980 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:08:54.929678 kubelet[2911]: E0128 00:08:54.929636 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:08:55.929571 containerd[1660]: time="2026-01-28T00:08:55.929533422Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 00:08:56.436548 containerd[1660]: time="2026-01-28T00:08:56.436480442Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:08:56.438072 containerd[1660]: time="2026-01-28T00:08:56.438021286Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 00:08:56.438132 containerd[1660]: time="2026-01-28T00:08:56.438103246Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 00:08:56.438288 kubelet[2911]: E0128 00:08:56.438255 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:08:56.438556 kubelet[2911]: E0128 00:08:56.438301 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:08:56.438556 kubelet[2911]: E0128 00:08:56.438422 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs5cv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-gql7d_calico-system(33c364a8-976a-4a66-b401-5e5e3f5c6aca): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 28 00:08:56.440348 containerd[1660]: time="2026-01-28T00:08:56.440322973Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 00:08:56.948886 containerd[1660]: time="2026-01-28T00:08:56.948828478Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:08:56.950290 containerd[1660]: time="2026-01-28T00:08:56.950245082Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 00:08:56.950503 containerd[1660]: time="2026-01-28T00:08:56.950332922Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 00:08:56.951170 kubelet[2911]: E0128 00:08:56.950634 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:08:56.951170 kubelet[2911]: E0128 00:08:56.950680 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:08:56.951170 kubelet[2911]: E0128 00:08:56.950802 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs5cv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-gql7d_calico-system(33c364a8-976a-4a66-b401-5e5e3f5c6aca): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 00:08:56.951974 kubelet[2911]: E0128 00:08:56.951943 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:08:57.928995 containerd[1660]: time="2026-01-28T00:08:57.928926775Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 00:08:58.448878 containerd[1660]: time="2026-01-28T00:08:58.448786314Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 
00:08:58.451433 containerd[1660]: time="2026-01-28T00:08:58.451385041Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 00:08:58.451632 containerd[1660]: time="2026-01-28T00:08:58.451469842Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 00:08:58.451673 kubelet[2911]: E0128 00:08:58.451607 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:08:58.451673 kubelet[2911]: E0128 00:08:58.451652 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:08:58.452021 kubelet[2911]: E0128 00:08:58.451828 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dh76x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-4g849_calico-system(115118f6-6ada-4444-93ef-3a99eaaacd44): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 00:08:58.453051 kubelet[2911]: E0128 00:08:58.453013 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:08:58.932805 containerd[1660]: time="2026-01-28T00:08:58.932746024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:08:59.414423 containerd[1660]: time="2026-01-28T00:08:59.414374606Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:08:59.416764 containerd[1660]: time="2026-01-28T00:08:59.416719774Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:08:59.416847 containerd[1660]: time="2026-01-28T00:08:59.416804774Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:08:59.417021 kubelet[2911]: E0128 00:08:59.416984 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:08:59.417088 kubelet[2911]: E0128 00:08:59.417034 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:08:59.417594 kubelet[2911]: E0128 00:08:59.417172 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lvvxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-564d6bbdf8-z4ccx_calico-apiserver(c30f0bb9-9acc-4a29-aba6-a8788797d9e8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:08:59.418401 kubelet[2911]: E0128 00:08:59.418369 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:08:59.931082 containerd[1660]: time="2026-01-28T00:08:59.930412854Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 00:09:00.431667 containerd[1660]: time="2026-01-28T00:09:00.431586976Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:09:00.433410 containerd[1660]: time="2026-01-28T00:09:00.433376341Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 00:09:00.433492 containerd[1660]: time="2026-01-28T00:09:00.433450902Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 
00:09:00.433746 kubelet[2911]: E0128 00:09:00.433639 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 00:09:00.433746 kubelet[2911]: E0128 00:09:00.433710 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 00:09:00.434251 kubelet[2911]: E0128 00:09:00.434169 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-468nq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d74b86764-tnxb5_calico-system(944e37ab-4f3a-457c-b85f-87dff3debf4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 00:09:00.435445 kubelet[2911]: E0128 00:09:00.435409 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: 
\"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:09:02.930370 kubelet[2911]: E0128 00:09:02.930309 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:09:08.933094 containerd[1660]: time="2026-01-28T00:09:08.932578756Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:09:09.273627 containerd[1660]: time="2026-01-28T00:09:09.273510232Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:09:09.274880 containerd[1660]: time="2026-01-28T00:09:09.274781516Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:09:09.274880 containerd[1660]: time="2026-01-28T00:09:09.274824836Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:09:09.275120 kubelet[2911]: E0128 00:09:09.275057 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:09:09.275120 kubelet[2911]: E0128 00:09:09.275106 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:09:09.275466 kubelet[2911]: E0128 00:09:09.275242 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vj8fj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-564d6bbdf8-pmnrt_calico-apiserver(0bfcf205-da0c-44de-9408-ba1ddb5cd934): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:09:09.276449 kubelet[2911]: E0128 00:09:09.276402 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:09:10.932132 kubelet[2911]: E0128 00:09:10.932065 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:09:10.933097 kubelet[2911]: E0128 00:09:10.933047 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc 
= failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:09:10.933356 kubelet[2911]: E0128 00:09:10.933306 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:09:13.795279 systemd[1]: Started sshd@7-10.0.1.105:22-4.153.228.146:56742.service - OpenSSH per-connection server daemon (4.153.228.146:56742). Jan 28 00:09:13.794000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.1.105:22-4.153.228.146:56742 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:13.796729 kernel: kauditd_printk_skb: 2 callbacks suppressed Jan 28 00:09:13.796794 kernel: audit: type=1130 audit(1769558953.794:746): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.1.105:22-4.153.228.146:56742 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:09:14.329000 audit[5169]: USER_ACCT pid=5169 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:14.330446 sshd[5169]: Accepted publickey for core from 4.153.228.146 port 56742 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:09:14.338252 kernel: audit: type=1101 audit(1769558954.329:747): pid=5169 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:14.338348 kernel: audit: type=1103 audit(1769558954.334:748): pid=5169 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:14.334000 audit[5169]: CRED_ACQ pid=5169 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:14.338632 sshd-session[5169]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:09:14.342250 kernel: audit: type=1006 audit(1769558954.334:749): pid=5169 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Jan 28 00:09:14.342339 kernel: audit: type=1300 audit(1769558954.334:749): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd9e2fec0 a2=3 a3=0 items=0 ppid=1 pid=5169 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:14.334000 audit[5169]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd9e2fec0 a2=3 a3=0 items=0 ppid=1 pid=5169 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:14.334000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:14.348085 kernel: audit: type=1327 audit(1769558954.334:749): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:14.351351 systemd-logind[1641]: New session 9 of user core. Jan 28 00:09:14.363404 systemd[1]: Started session-9.scope - Session 9 of User core. 
Jan 28 00:09:14.365000 audit[5169]: USER_START pid=5169 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:14.366000 audit[5173]: CRED_ACQ pid=5173 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:14.376226 kernel: audit: type=1105 audit(1769558954.365:750): pid=5169 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:14.376358 kernel: audit: type=1103 audit(1769558954.366:751): pid=5173 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:14.693259 sshd[5173]: Connection closed by 4.153.228.146 port 56742 Jan 28 00:09:14.693041 sshd-session[5169]: pam_unix(sshd:session): session closed for user core Jan 28 00:09:14.693000 audit[5169]: USER_END pid=5169 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:14.699522 systemd[1]: sshd@7-10.0.1.105:22-4.153.228.146:56742.service: Deactivated successfully. Jan 28 00:09:14.693000 audit[5169]: CRED_DISP pid=5169 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:14.703021 kernel: audit: type=1106 audit(1769558954.693:752): pid=5169 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:14.703095 kernel: audit: type=1104 audit(1769558954.693:753): pid=5169 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:14.701259 systemd[1]: session-9.scope: Deactivated successfully. Jan 28 00:09:14.703253 systemd-logind[1641]: Session 9 logged out. Waiting for processes to exit. Jan 28 00:09:14.698000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.1.105:22-4.153.228.146:56742 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:14.704742 systemd-logind[1641]: Removed session 9. 
Jan 28 00:09:15.930225 kubelet[2911]: E0128 00:09:15.930132 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:09:17.929593 kubelet[2911]: E0128 00:09:17.929534 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:09:19.802000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.1.105:22-4.153.228.146:52802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:19.803733 systemd[1]: Started sshd@8-10.0.1.105:22-4.153.228.146:52802.service - OpenSSH per-connection server daemon (4.153.228.146:52802). Jan 28 00:09:19.805066 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:09:19.805121 kernel: audit: type=1130 audit(1769558959.802:755): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.1.105:22-4.153.228.146:52802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:09:20.322000 audit[5190]: USER_ACCT pid=5190 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:20.324038 sshd[5190]: Accepted publickey for core from 4.153.228.146 port 52802 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:09:20.326000 audit[5190]: CRED_ACQ pid=5190 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:20.328711 sshd-session[5190]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:09:20.330917 kernel: audit: type=1101 audit(1769558960.322:756): pid=5190 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:20.330976 kernel: audit: type=1103 audit(1769558960.326:757): pid=5190 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:20.332781 kernel: audit: type=1006 audit(1769558960.326:758): pid=5190 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Jan 28 00:09:20.332843 kernel: audit: type=1300 audit(1769558960.326:758): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc8bb3130 a2=3 a3=0 items=0 ppid=1 pid=5190 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:20.326000 audit[5190]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc8bb3130 a2=3 a3=0 items=0 ppid=1 pid=5190 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:20.326000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:20.337452 kernel: audit: type=1327 audit(1769558960.326:758): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:20.339470 systemd-logind[1641]: New session 10 of user core. Jan 28 00:09:20.345359 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 28 00:09:20.346000 audit[5190]: USER_START pid=5190 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:20.348000 audit[5194]: CRED_ACQ pid=5194 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:20.354827 kernel: audit: type=1105 audit(1769558960.346:759): pid=5190 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:20.354888 kernel: audit: type=1103 audit(1769558960.348:760): pid=5194 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:20.676519 sshd[5194]: Connection closed by 4.153.228.146 port 52802 Jan 28 00:09:20.676979 sshd-session[5190]: pam_unix(sshd:session): session closed for user core Jan 28 00:09:20.677000 audit[5190]: USER_END pid=5190 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:20.681577 systemd-logind[1641]: Session 10 logged out. Waiting for processes to exit. Jan 28 00:09:20.681820 systemd[1]: sshd@8-10.0.1.105:22-4.153.228.146:52802.service: Deactivated successfully. Jan 28 00:09:20.677000 audit[5190]: CRED_DISP pid=5190 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:20.685546 systemd[1]: session-10.scope: Deactivated successfully. Jan 28 00:09:20.686070 kernel: audit: type=1106 audit(1769558960.677:761): pid=5190 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:20.686122 kernel: audit: type=1104 audit(1769558960.677:762): pid=5190 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:20.680000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.1.105:22-4.153.228.146:52802 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:20.686990 systemd-logind[1641]: Removed session 10. 
Jan 28 00:09:22.929789 kubelet[2911]: E0128 00:09:22.929717 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:09:24.931725 kubelet[2911]: E0128 00:09:24.931246 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:09:24.933748 kubelet[2911]: E0128 00:09:24.933702 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:09:24.933889 kubelet[2911]: E0128 00:09:24.933807 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:09:25.785170 systemd[1]: Started sshd@9-10.0.1.105:22-4.153.228.146:39478.service - OpenSSH per-connection server daemon (4.153.228.146:39478). Jan 28 00:09:25.784000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.1.105:22-4.153.228.146:39478 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:25.788946 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:09:25.789074 kernel: audit: type=1130 audit(1769558965.784:764): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.1.105:22-4.153.228.146:39478 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:09:26.303000 audit[5211]: USER_ACCT pid=5211 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:26.305408 sshd[5211]: Accepted publickey for core from 4.153.228.146 port 39478 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:09:26.308000 audit[5211]: CRED_ACQ pid=5211 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:26.310413 sshd-session[5211]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:09:26.312587 kernel: audit: type=1101 audit(1769558966.303:765): pid=5211 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:26.312665 kernel: audit: type=1103 audit(1769558966.308:766): pid=5211 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:26.314665 kernel: audit: type=1006 audit(1769558966.308:767): pid=5211 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 28 00:09:26.308000 audit[5211]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3dce390 a2=3 a3=0 items=0 ppid=1 pid=5211 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:26.318427 kernel: audit: type=1300 audit(1769558966.308:767): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3dce390 a2=3 a3=0 items=0 ppid=1 pid=5211 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:26.308000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:26.319918 kernel: audit: type=1327 audit(1769558966.308:767): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:26.322278 systemd-logind[1641]: New session 11 of user core. Jan 28 00:09:26.331633 systemd[1]: Started session-11.scope - Session 11 of User core. 
Jan 28 00:09:26.332000 audit[5211]: USER_START pid=5211 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:26.338232 kernel: audit: type=1105 audit(1769558966.332:768): pid=5211 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:26.337000 audit[5215]: CRED_ACQ pid=5215 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:26.341236 kernel: audit: type=1103 audit(1769558966.337:769): pid=5215 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:26.690969 sshd[5215]: Connection closed by 4.153.228.146 port 39478 Jan 28 00:09:26.691390 sshd-session[5211]: pam_unix(sshd:session): session closed for user core Jan 28 00:09:26.691000 audit[5211]: USER_END pid=5211 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:26.695737 systemd[1]: sshd@9-10.0.1.105:22-4.153.228.146:39478.service: Deactivated successfully. Jan 28 00:09:26.692000 audit[5211]: CRED_DISP pid=5211 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:26.697729 systemd[1]: session-11.scope: Deactivated successfully. Jan 28 00:09:26.700421 kernel: audit: type=1106 audit(1769558966.691:770): pid=5211 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:26.700510 kernel: audit: type=1104 audit(1769558966.692:771): pid=5211 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:26.700449 systemd-logind[1641]: Session 11 logged out. Waiting for processes to exit. Jan 28 00:09:26.694000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.1.105:22-4.153.228.146:39478 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:26.701312 systemd-logind[1641]: Removed session 11. 
Jan 28 00:09:26.800000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.1.105:22-4.153.228.146:39482 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:26.801523 systemd[1]: Started sshd@10-10.0.1.105:22-4.153.228.146:39482.service - OpenSSH per-connection server daemon (4.153.228.146:39482). Jan 28 00:09:27.325000 audit[5230]: USER_ACCT pid=5230 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:27.326614 sshd[5230]: Accepted publickey for core from 4.153.228.146 port 39482 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:09:27.326000 audit[5230]: CRED_ACQ pid=5230 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:27.326000 audit[5230]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff850d870 a2=3 a3=0 items=0 ppid=1 pid=5230 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:27.326000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:27.328199 sshd-session[5230]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:09:27.335358 systemd-logind[1641]: New session 12 of user core. Jan 28 00:09:27.344394 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 28 00:09:27.345000 audit[5230]: USER_START pid=5230 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:27.348000 audit[5234]: CRED_ACQ pid=5234 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:27.711408 sshd[5234]: Connection closed by 4.153.228.146 port 39482 Jan 28 00:09:27.710693 sshd-session[5230]: pam_unix(sshd:session): session closed for user core Jan 28 00:09:27.711000 audit[5230]: USER_END pid=5230 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:27.711000 audit[5230]: CRED_DISP pid=5230 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:27.716423 systemd-logind[1641]: Session 12 logged out. Waiting for processes to exit. 
Jan 28 00:09:27.716644 systemd[1]: sshd@10-10.0.1.105:22-4.153.228.146:39482.service: Deactivated successfully. Jan 28 00:09:27.715000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.1.105:22-4.153.228.146:39482 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:27.718422 systemd[1]: session-12.scope: Deactivated successfully. Jan 28 00:09:27.719856 systemd-logind[1641]: Removed session 12. Jan 28 00:09:27.811000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.1.105:22-4.153.228.146:39490 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:27.812924 systemd[1]: Started sshd@11-10.0.1.105:22-4.153.228.146:39490.service - OpenSSH per-connection server daemon (4.153.228.146:39490). Jan 28 00:09:28.337000 audit[5246]: USER_ACCT pid=5246 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:28.339441 sshd[5246]: Accepted publickey for core from 4.153.228.146 port 39490 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:09:28.339000 audit[5246]: CRED_ACQ pid=5246 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:28.339000 audit[5246]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff0739270 a2=3 a3=0 items=0 ppid=1 pid=5246 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:28.339000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:28.341006 sshd-session[5246]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:09:28.344898 systemd-logind[1641]: New session 13 of user core. Jan 28 00:09:28.351370 systemd[1]: Started session-13.scope - Session 13 of User core. 
Jan 28 00:09:28.352000 audit[5246]: USER_START pid=5246 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:28.353000 audit[5250]: CRED_ACQ pid=5250 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:28.705762 sshd[5250]: Connection closed by 4.153.228.146 port 39490 Jan 28 00:09:28.706651 sshd-session[5246]: pam_unix(sshd:session): session closed for user core Jan 28 00:09:28.707000 audit[5246]: USER_END pid=5246 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:28.707000 audit[5246]: CRED_DISP pid=5246 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:28.711438 systemd[1]: sshd@11-10.0.1.105:22-4.153.228.146:39490.service: Deactivated successfully. Jan 28 00:09:28.710000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.1.105:22-4.153.228.146:39490 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:28.713146 systemd[1]: session-13.scope: Deactivated successfully. Jan 28 00:09:28.713858 systemd-logind[1641]: Session 13 logged out. Waiting for processes to exit. Jan 28 00:09:28.715120 systemd-logind[1641]: Removed session 13. 
Jan 28 00:09:29.930247 kubelet[2911]: E0128 00:09:29.930180 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:09:31.929266 kubelet[2911]: E0128 00:09:31.929184 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:09:33.816285 systemd[1]: Started sshd@12-10.0.1.105:22-4.153.228.146:39492.service - OpenSSH per-connection server daemon (4.153.228.146:39492). Jan 28 00:09:33.815000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.1.105:22-4.153.228.146:39492 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:33.819818 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 28 00:09:33.819904 kernel: audit: type=1130 audit(1769558973.815:791): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.1.105:22-4.153.228.146:39492 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:09:34.332000 audit[5269]: USER_ACCT pid=5269 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:34.334394 sshd[5269]: Accepted publickey for core from 4.153.228.146 port 39492 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:09:34.336000 audit[5269]: CRED_ACQ pid=5269 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:34.339105 sshd-session[5269]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:09:34.341218 kernel: audit: type=1101 audit(1769558974.332:792): pid=5269 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:34.341290 kernel: audit: type=1103 audit(1769558974.336:793): pid=5269 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:34.341613 kernel: audit: type=1006 audit(1769558974.337:794): pid=5269 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Jan 28 00:09:34.337000 audit[5269]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe923cb70 a2=3 a3=0 items=0 ppid=1 pid=5269 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:34.343543 systemd-logind[1641]: New session 14 of user core. Jan 28 00:09:34.346549 kernel: audit: type=1300 audit(1769558974.337:794): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe923cb70 a2=3 a3=0 items=0 ppid=1 pid=5269 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:34.346666 kernel: audit: type=1327 audit(1769558974.337:794): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:34.337000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:34.352365 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jan 28 00:09:34.354000 audit[5269]: USER_START pid=5269 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:34.356000 audit[5273]: CRED_ACQ pid=5273 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:34.362127 kernel: audit: type=1105 audit(1769558974.354:795): pid=5269 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:34.362178 kernel: audit: type=1103 audit(1769558974.356:796): pid=5273 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:34.693661 sshd[5273]: Connection closed by 4.153.228.146 port 39492 Jan 28 00:09:34.696444 sshd-session[5269]: pam_unix(sshd:session): session closed for user core Jan 28 00:09:34.696000 audit[5269]: USER_END pid=5269 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:34.700566 systemd-logind[1641]: Session 14 logged out. Waiting for processes to exit. Jan 28 00:09:34.700802 systemd[1]: sshd@12-10.0.1.105:22-4.153.228.146:39492.service: Deactivated successfully. Jan 28 00:09:34.703963 systemd[1]: session-14.scope: Deactivated successfully. Jan 28 00:09:34.696000 audit[5269]: CRED_DISP pid=5269 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:34.707885 kernel: audit: type=1106 audit(1769558974.696:797): pid=5269 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:34.707953 kernel: audit: type=1104 audit(1769558974.696:798): pid=5269 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:34.701000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.1.105:22-4.153.228.146:39492 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:34.708004 systemd-logind[1641]: Removed session 14. 
Jan 28 00:09:35.929742 kubelet[2911]: E0128 00:09:35.929645 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:09:37.929992 kubelet[2911]: E0128 00:09:37.929947 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:09:38.928838 kubelet[2911]: E0128 00:09:38.928754 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:09:39.805184 systemd[1]: Started sshd@13-10.0.1.105:22-4.153.228.146:57860.service - OpenSSH per-connection server daemon (4.153.228.146:57860). Jan 28 00:09:39.804000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.1.105:22-4.153.228.146:57860 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:39.809036 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:09:39.809108 kernel: audit: type=1130 audit(1769558979.804:800): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.1.105:22-4.153.228.146:57860 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:09:39.931430 kubelet[2911]: E0128 00:09:39.931375 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:09:40.329000 audit[5310]: USER_ACCT pid=5310 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:40.335316 sshd[5310]: Accepted publickey for core from 4.153.228.146 port 57860 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:09:40.334000 audit[5310]: CRED_ACQ pid=5310 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:40.336147 sshd-session[5310]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:09:40.339018 kernel: audit: type=1101 audit(1769558980.329:801): pid=5310 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:40.339077 kernel: audit: type=1103 audit(1769558980.334:802): pid=5310 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:40.341077 kernel: audit: type=1006 audit(1769558980.334:803): pid=5310 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Jan 28 00:09:40.341132 kernel: audit: type=1300 audit(1769558980.334:803): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcc769d00 a2=3 a3=0 items=0 ppid=1 pid=5310 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:40.334000 audit[5310]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcc769d00 a2=3 a3=0 items=0 ppid=1 pid=5310 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:40.334000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:40.345993 kernel: audit: type=1327 
audit(1769558980.334:803): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:40.350304 systemd-logind[1641]: New session 15 of user core. Jan 28 00:09:40.354401 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 28 00:09:40.356000 audit[5310]: USER_START pid=5310 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:40.361000 audit[5314]: CRED_ACQ pid=5314 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:40.364826 kernel: audit: type=1105 audit(1769558980.356:804): pid=5310 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:40.364955 kernel: audit: type=1103 audit(1769558980.361:805): pid=5314 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:40.682310 sshd[5314]: Connection closed by 4.153.228.146 port 57860 Jan 28 00:09:40.682706 sshd-session[5310]: pam_unix(sshd:session): session closed for user core Jan 28 00:09:40.683000 audit[5310]: USER_END pid=5310 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:40.683000 audit[5310]: CRED_DISP pid=5310 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:40.690947 systemd[1]: sshd@13-10.0.1.105:22-4.153.228.146:57860.service: Deactivated successfully. Jan 28 00:09:40.692059 kernel: audit: type=1106 audit(1769558980.683:806): pid=5310 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:40.692103 kernel: audit: type=1104 audit(1769558980.683:807): pid=5310 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:40.691000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.1.105:22-4.153.228.146:57860 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:09:40.693643 systemd[1]: session-15.scope: Deactivated successfully. Jan 28 00:09:40.695142 systemd-logind[1641]: Session 15 logged out. Waiting for processes to exit. Jan 28 00:09:40.696708 systemd-logind[1641]: Removed session 15. Jan 28 00:09:42.930342 kubelet[2911]: E0128 00:09:42.930280 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:09:45.789158 systemd[1]: Started sshd@14-10.0.1.105:22-4.153.228.146:48660.service - OpenSSH per-connection server daemon (4.153.228.146:48660). Jan 28 00:09:45.788000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.1.105:22-4.153.228.146:48660 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:45.790231 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:09:45.790308 kernel: audit: type=1130 audit(1769558985.788:809): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.1.105:22-4.153.228.146:48660 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:09:46.313000 audit[5330]: USER_ACCT pid=5330 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:46.314898 sshd[5330]: Accepted publickey for core from 4.153.228.146 port 48660 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:09:46.319286 kernel: audit: type=1101 audit(1769558986.313:810): pid=5330 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:46.319406 kernel: audit: type=1103 audit(1769558986.318:811): pid=5330 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:46.318000 audit[5330]: CRED_ACQ pid=5330 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:46.320477 sshd-session[5330]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:09:46.325166 kernel: audit: type=1006 audit(1769558986.318:812): pid=5330 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 28 00:09:46.325280 kernel: audit: type=1300 audit(1769558986.318:812): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff62f3d30 a2=3 a3=0 items=0 ppid=1 pid=5330 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:46.318000 audit[5330]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff62f3d30 a2=3 a3=0 items=0 ppid=1 pid=5330 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:46.318000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:46.330935 kernel: audit: type=1327 audit(1769558986.318:812): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:46.332322 systemd-logind[1641]: New session 16 of user core. Jan 28 00:09:46.345410 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jan 28 00:09:46.346000 audit[5330]: USER_START pid=5330 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:46.348000 audit[5334]: CRED_ACQ pid=5334 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:46.355969 kernel: audit: type=1105 audit(1769558986.346:813): pid=5330 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:46.356044 kernel: audit: type=1103 audit(1769558986.348:814): pid=5334 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:46.662717 sshd[5334]: Connection closed by 4.153.228.146 port 48660 Jan 28 00:09:46.663524 sshd-session[5330]: pam_unix(sshd:session): session closed for user core Jan 28 00:09:46.664000 audit[5330]: USER_END pid=5330 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:46.668773 systemd[1]: sshd@14-10.0.1.105:22-4.153.228.146:48660.service: Deactivated successfully. Jan 28 00:09:46.670569 systemd[1]: session-16.scope: Deactivated successfully. Jan 28 00:09:46.664000 audit[5330]: CRED_DISP pid=5330 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:46.671856 systemd-logind[1641]: Session 16 logged out. Waiting for processes to exit. Jan 28 00:09:46.672728 systemd-logind[1641]: Removed session 16. Jan 28 00:09:46.673945 kernel: audit: type=1106 audit(1769558986.664:815): pid=5330 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:46.673995 kernel: audit: type=1104 audit(1769558986.664:816): pid=5330 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:46.667000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.1.105:22-4.153.228.146:48660 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:09:46.766825 systemd[1]: Started sshd@15-10.0.1.105:22-4.153.228.146:48666.service - OpenSSH per-connection server daemon (4.153.228.146:48666). Jan 28 00:09:46.765000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.1.105:22-4.153.228.146:48666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:46.931981 kubelet[2911]: E0128 00:09:46.931860 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:09:47.275000 audit[5348]: USER_ACCT pid=5348 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:47.277509 sshd[5348]: Accepted publickey for core from 4.153.228.146 port 48666 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:09:47.277000 audit[5348]: CRED_ACQ pid=5348 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:47.277000 audit[5348]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdc551000 a2=3 a3=0 items=0 ppid=1 pid=5348 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:47.277000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:47.279152 sshd-session[5348]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:09:47.283988 systemd-logind[1641]: New session 17 of user core. Jan 28 00:09:47.295531 systemd[1]: Started session-17.scope - Session 17 of User core. 
Jan 28 00:09:47.296000 audit[5348]: USER_START pid=5348 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:47.298000 audit[5352]: CRED_ACQ pid=5352 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:47.682013 sshd[5352]: Connection closed by 4.153.228.146 port 48666 Jan 28 00:09:47.683857 sshd-session[5348]: pam_unix(sshd:session): session closed for user core Jan 28 00:09:47.684000 audit[5348]: USER_END pid=5348 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:47.684000 audit[5348]: CRED_DISP pid=5348 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:47.688927 systemd-logind[1641]: Session 17 logged out. Waiting for processes to exit. Jan 28 00:09:47.689559 systemd[1]: sshd@15-10.0.1.105:22-4.153.228.146:48666.service: Deactivated successfully. Jan 28 00:09:47.688000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.1.105:22-4.153.228.146:48666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:47.693509 systemd[1]: session-17.scope: Deactivated successfully. Jan 28 00:09:47.694533 systemd-logind[1641]: Removed session 17. Jan 28 00:09:47.786000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.1.105:22-4.153.228.146:48674 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:47.788071 systemd[1]: Started sshd@16-10.0.1.105:22-4.153.228.146:48674.service - OpenSSH per-connection server daemon (4.153.228.146:48674). 
Jan 28 00:09:47.929515 kubelet[2911]: E0128 00:09:47.929432 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:09:48.314000 audit[5364]: USER_ACCT pid=5364 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:48.316332 sshd[5364]: Accepted publickey for core from 4.153.228.146 port 48674 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:09:48.316000 audit[5364]: CRED_ACQ pid=5364 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:48.316000 audit[5364]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc80ca680 a2=3 a3=0 items=0 ppid=1 pid=5364 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:48.316000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:48.318351 sshd-session[5364]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:09:48.330194 systemd-logind[1641]: New session 18 of user core. Jan 28 00:09:48.336448 systemd[1]: Started session-18.scope - Session 18 of User core. 
Jan 28 00:09:48.340000 audit[5364]: USER_START pid=5364 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:48.342000 audit[5368]: CRED_ACQ pid=5368 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:49.094000 audit[5379]: NETFILTER_CFG table=filter:144 family=2 entries=26 op=nft_register_rule pid=5379 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:09:49.094000 audit[5379]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffe688b330 a2=0 a3=1 items=0 ppid=3024 pid=5379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:49.094000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:09:49.101000 audit[5379]: NETFILTER_CFG table=nat:145 family=2 entries=20 op=nft_register_rule pid=5379 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:09:49.101000 audit[5379]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffe688b330 a2=0 a3=1 items=0 ppid=3024 pid=5379 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:49.101000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:09:49.120000 audit[5381]: NETFILTER_CFG table=filter:146 family=2 entries=38 op=nft_register_rule pid=5381 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:09:49.120000 audit[5381]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffc9e489f0 a2=0 a3=1 items=0 ppid=3024 pid=5381 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:49.120000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:09:49.127000 audit[5381]: NETFILTER_CFG table=nat:147 family=2 entries=20 op=nft_register_rule pid=5381 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:09:49.127000 audit[5381]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffc9e489f0 a2=0 a3=1 items=0 ppid=3024 pid=5381 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:49.127000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:09:49.195171 sshd[5368]: Connection closed by 4.153.228.146 port 48674 Jan 28 00:09:49.196411 sshd-session[5364]: pam_unix(sshd:session): session closed for user core 
Jan 28 00:09:49.196000 audit[5364]: USER_END pid=5364 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:49.196000 audit[5364]: CRED_DISP pid=5364 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:49.200740 systemd[1]: sshd@16-10.0.1.105:22-4.153.228.146:48674.service: Deactivated successfully. Jan 28 00:09:49.200000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.1.105:22-4.153.228.146:48674 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:49.202985 systemd[1]: session-18.scope: Deactivated successfully. Jan 28 00:09:49.204658 systemd-logind[1641]: Session 18 logged out. Waiting for processes to exit. Jan 28 00:09:49.205775 systemd-logind[1641]: Removed session 18. Jan 28 00:09:49.299676 systemd[1]: Started sshd@17-10.0.1.105:22-4.153.228.146:48686.service - OpenSSH per-connection server daemon (4.153.228.146:48686). Jan 28 00:09:49.298000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.1.105:22-4.153.228.146:48686 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:49.820000 audit[5386]: USER_ACCT pid=5386 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:49.822096 sshd[5386]: Accepted publickey for core from 4.153.228.146 port 48686 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:09:49.821000 audit[5386]: CRED_ACQ pid=5386 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:49.821000 audit[5386]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdb2d3400 a2=3 a3=0 items=0 ppid=1 pid=5386 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:49.821000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:49.823671 sshd-session[5386]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:09:49.830531 systemd-logind[1641]: New session 19 of user core. Jan 28 00:09:49.840610 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 28 00:09:49.841000 audit[5386]: USER_START pid=5386 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:49.843000 audit[5390]: CRED_ACQ pid=5390 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:50.281147 sshd[5390]: Connection closed by 4.153.228.146 port 48686 Jan 28 00:09:50.281450 sshd-session[5386]: pam_unix(sshd:session): session closed for user core Jan 28 00:09:50.281000 audit[5386]: USER_END pid=5386 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:50.281000 audit[5386]: CRED_DISP pid=5386 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:50.285654 systemd[1]: sshd@17-10.0.1.105:22-4.153.228.146:48686.service: Deactivated successfully. Jan 28 00:09:50.284000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.1.105:22-4.153.228.146:48686 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:50.287484 systemd[1]: session-19.scope: Deactivated successfully. Jan 28 00:09:50.290045 systemd-logind[1641]: Session 19 logged out. Waiting for processes to exit. Jan 28 00:09:50.291314 systemd-logind[1641]: Removed session 19. Jan 28 00:09:50.385780 systemd[1]: Started sshd@18-10.0.1.105:22-4.153.228.146:48698.service - OpenSSH per-connection server daemon (4.153.228.146:48698). Jan 28 00:09:50.384000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.1.105:22-4.153.228.146:48698 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:09:50.913000 audit[5402]: USER_ACCT pid=5402 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:50.915405 sshd[5402]: Accepted publickey for core from 4.153.228.146 port 48698 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:09:50.915712 kernel: kauditd_printk_skb: 47 callbacks suppressed Jan 28 00:09:50.915748 kernel: audit: type=1101 audit(1769558990.913:850): pid=5402 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:50.917000 audit[5402]: CRED_ACQ pid=5402 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:50.919785 sshd-session[5402]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:09:50.923050 kernel: audit: type=1103 audit(1769558990.917:851): pid=5402 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:50.925233 kernel: audit: type=1006 audit(1769558990.917:852): pid=5402 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Jan 28 00:09:50.917000 audit[5402]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff175a680 a2=3 a3=0 items=0 ppid=1 pid=5402 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:50.928932 kernel: audit: type=1300 audit(1769558990.917:852): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff175a680 a2=3 a3=0 items=0 ppid=1 pid=5402 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:50.917000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:50.930476 kernel: audit: type=1327 audit(1769558990.917:852): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:50.931506 kubelet[2911]: E0128 00:09:50.931375 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:09:50.931797 kubelet[2911]: E0128 00:09:50.931730 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:09:50.931797 kubelet[2911]: E0128 00:09:50.931762 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:09:50.933871 systemd-logind[1641]: New session 20 of user core. Jan 28 00:09:50.944747 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 28 00:09:50.949000 audit[5402]: USER_START pid=5402 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:50.953000 audit[5406]: CRED_ACQ pid=5406 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:50.957893 kernel: audit: type=1105 audit(1769558990.949:853): pid=5402 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:50.957981 kernel: audit: type=1103 audit(1769558990.953:854): pid=5406 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:51.267974 sshd[5406]: Connection closed by 4.153.228.146 port 48698 Jan 28 00:09:51.268167 sshd-session[5402]: pam_unix(sshd:session): session closed for user core Jan 28 00:09:51.268000 audit[5402]: USER_END pid=5402 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:51.272557 systemd[1]: sshd@18-10.0.1.105:22-4.153.228.146:48698.service: Deactivated successfully. 
Jan 28 00:09:51.268000 audit[5402]: CRED_DISP pid=5402 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:51.276246 systemd[1]: session-20.scope: Deactivated successfully. Jan 28 00:09:51.277004 kernel: audit: type=1106 audit(1769558991.268:855): pid=5402 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:51.277069 kernel: audit: type=1104 audit(1769558991.268:856): pid=5402 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:51.277091 kernel: audit: type=1131 audit(1769558991.271:857): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.1.105:22-4.153.228.146:48698 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:51.271000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.1.105:22-4.153.228.146:48698 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:51.280130 systemd-logind[1641]: Session 20 logged out. Waiting for processes to exit. Jan 28 00:09:51.280981 systemd-logind[1641]: Removed session 20. 
Jan 28 00:09:53.351000 audit[5421]: NETFILTER_CFG table=filter:148 family=2 entries=26 op=nft_register_rule pid=5421 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:09:53.351000 audit[5421]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffcd25b880 a2=0 a3=1 items=0 ppid=3024 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:53.351000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:09:53.358000 audit[5421]: NETFILTER_CFG table=nat:149 family=2 entries=104 op=nft_register_chain pid=5421 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 28 00:09:53.358000 audit[5421]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=ffffcd25b880 a2=0 a3=1 items=0 ppid=3024 pid=5421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:53.358000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 28 00:09:55.930421 kubelet[2911]: E0128 00:09:55.930169 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:09:56.374361 systemd[1]: Started sshd@19-10.0.1.105:22-4.153.228.146:44924.service - OpenSSH per-connection server daemon (4.153.228.146:44924). Jan 28 00:09:56.373000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.1.105:22-4.153.228.146:44924 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:56.375618 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 28 00:09:56.375695 kernel: audit: type=1130 audit(1769558996.373:860): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.1.105:22-4.153.228.146:44924 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:09:56.903000 audit[5423]: USER_ACCT pid=5423 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:56.904477 sshd[5423]: Accepted publickey for core from 4.153.228.146 port 44924 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:09:56.906000 audit[5423]: CRED_ACQ pid=5423 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:56.908730 sshd-session[5423]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:09:56.911202 kernel: audit: type=1101 audit(1769558996.903:861): pid=5423 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:56.911274 kernel: audit: type=1103 audit(1769558996.906:862): pid=5423 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:56.911295 kernel: audit: type=1006 audit(1769558996.906:863): pid=5423 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Jan 28 00:09:56.906000 audit[5423]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffdb590c0 a2=3 a3=0 items=0 ppid=1 pid=5423 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:56.915678 systemd-logind[1641]: New session 21 of user core. Jan 28 00:09:56.917230 kernel: audit: type=1300 audit(1769558996.906:863): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffdb590c0 a2=3 a3=0 items=0 ppid=1 pid=5423 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:09:56.906000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:56.918618 kernel: audit: type=1327 audit(1769558996.906:863): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:09:56.927480 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 28 00:09:56.930000 audit[5423]: USER_START pid=5423 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:56.935000 audit[5427]: CRED_ACQ pid=5427 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:56.939688 kernel: audit: type=1105 audit(1769558996.930:864): pid=5423 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:56.939776 kernel: audit: type=1103 audit(1769558996.935:865): pid=5427 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:57.260532 sshd[5427]: Connection closed by 4.153.228.146 port 44924 Jan 28 00:09:57.260339 sshd-session[5423]: pam_unix(sshd:session): session closed for user core Jan 28 00:09:57.261000 audit[5423]: USER_END pid=5423 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:57.265591 systemd-logind[1641]: Session 21 logged out. Waiting for processes to exit. Jan 28 00:09:57.266114 systemd[1]: sshd@19-10.0.1.105:22-4.153.228.146:44924.service: Deactivated successfully. Jan 28 00:09:57.261000 audit[5423]: CRED_DISP pid=5423 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:57.267828 systemd[1]: session-21.scope: Deactivated successfully. Jan 28 00:09:57.269952 kernel: audit: type=1106 audit(1769558997.261:866): pid=5423 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:57.270042 kernel: audit: type=1104 audit(1769558997.261:867): pid=5423 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:09:57.265000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.1.105:22-4.153.228.146:44924 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:09:57.270816 systemd-logind[1641]: Removed session 21. 
Jan 28 00:10:00.930818 kubelet[2911]: E0128 00:10:00.930776 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:10:00.931341 kubelet[2911]: E0128 00:10:00.931052 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:10:02.366312 systemd[1]: Started sshd@20-10.0.1.105:22-4.153.228.146:44928.service - OpenSSH per-connection server daemon (4.153.228.146:44928). Jan 28 00:10:02.365000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.1.105:22-4.153.228.146:44928 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:10:02.370622 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:10:02.370726 kernel: audit: type=1130 audit(1769559002.365:869): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.1.105:22-4.153.228.146:44928 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:10:02.889909 sshd[5440]: Accepted publickey for core from 4.153.228.146 port 44928 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:10:02.888000 audit[5440]: USER_ACCT pid=5440 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:02.892000 audit[5440]: CRED_ACQ pid=5440 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:02.894303 sshd-session[5440]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:10:02.897378 kernel: audit: type=1101 audit(1769559002.888:870): pid=5440 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:02.897439 kernel: audit: type=1103 audit(1769559002.892:871): pid=5440 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:02.900091 kernel: audit: type=1006 audit(1769559002.892:872): pid=5440 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Jan 28 00:10:02.892000 audit[5440]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffee9be40 a2=3 a3=0 items=0 ppid=1 pid=5440 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:10:02.903830 kernel: audit: type=1300 audit(1769559002.892:872): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffee9be40 a2=3 a3=0 items=0 ppid=1 pid=5440 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:10:02.892000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:10:02.905597 kernel: audit: type=1327 audit(1769559002.892:872): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:10:02.910271 systemd-logind[1641]: New session 22 of user core. Jan 28 00:10:02.918405 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jan 28 00:10:02.920000 audit[5440]: USER_START pid=5440 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:02.924000 audit[5444]: CRED_ACQ pid=5444 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:02.929287 kernel: audit: type=1105 audit(1769559002.920:873): pid=5440 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:02.929351 kernel: audit: type=1103 audit(1769559002.924:874): pid=5444 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:02.929814 kubelet[2911]: E0128 00:10:02.929394 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:10:03.231007 sshd[5444]: Connection closed by 4.153.228.146 port 44928 Jan 28 00:10:03.231287 sshd-session[5440]: pam_unix(sshd:session): session closed for user core Jan 28 00:10:03.231000 audit[5440]: USER_END pid=5440 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:03.235307 systemd[1]: sshd@20-10.0.1.105:22-4.153.228.146:44928.service: Deactivated successfully. Jan 28 00:10:03.231000 audit[5440]: CRED_DISP pid=5440 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:03.237645 systemd[1]: session-22.scope: Deactivated successfully. Jan 28 00:10:03.238506 systemd-logind[1641]: Session 22 logged out. Waiting for processes to exit. Jan 28 00:10:03.239596 systemd-logind[1641]: Removed session 22. 
Jan 28 00:10:03.239670 kernel: audit: type=1106 audit(1769559003.231:875): pid=5440 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:03.239700 kernel: audit: type=1104 audit(1769559003.231:876): pid=5440 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:03.234000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.1.105:22-4.153.228.146:44928 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:10:04.930225 kubelet[2911]: E0128 00:10:04.930102 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:10:04.931084 kubelet[2911]: E0128 00:10:04.931028 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:10:07.929843 containerd[1660]: time="2026-01-28T00:10:07.929804950Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 28 00:10:08.281435 containerd[1660]: time="2026-01-28T00:10:08.281329378Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:10:08.283541 containerd[1660]: time="2026-01-28T00:10:08.283494145Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 28 00:10:08.283662 containerd[1660]: time="2026-01-28T00:10:08.283585665Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 28 00:10:08.284340 kubelet[2911]: E0128 00:10:08.284293 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 00:10:08.284340 kubelet[2911]: E0128 00:10:08.284341 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 28 00:10:08.284656 kubelet[2911]: E0128 00:10:08.284446 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:317bcf13c30d4d6c9248c8e05fdeda91,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-468nq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d74b86764-tnxb5_calico-system(944e37ab-4f3a-457c-b85f-87dff3debf4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 28 00:10:08.286536 kubelet[2911]: E0128 00:10:08.286488 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:10:08.337933 systemd[1]: Started sshd@21-10.0.1.105:22-4.153.228.146:52012.service - OpenSSH per-connection server daemon (4.153.228.146:52012). 
Jan 28 00:10:08.339043 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:10:08.339075 kernel: audit: type=1130 audit(1769559008.336:878): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.1.105:22-4.153.228.146:52012 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:10:08.336000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.1.105:22-4.153.228.146:52012 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:10:08.857121 sshd[5489]: Accepted publickey for core from 4.153.228.146 port 52012 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:10:08.855000 audit[5489]: USER_ACCT pid=5489 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:08.859000 audit[5489]: CRED_ACQ pid=5489 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:08.862992 sshd-session[5489]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:10:08.864003 kernel: audit: type=1101 audit(1769559008.855:879): pid=5489 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:08.864057 kernel: audit: type=1103 audit(1769559008.859:880): pid=5489 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:08.866022 kernel: audit: type=1006 audit(1769559008.861:881): pid=5489 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Jan 28 00:10:08.861000 audit[5489]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffee3afcd0 a2=3 a3=0 items=0 ppid=1 pid=5489 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:10:08.869498 kernel: audit: type=1300 audit(1769559008.861:881): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffee3afcd0 a2=3 a3=0 items=0 ppid=1 pid=5489 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:10:08.869239 systemd-logind[1641]: New session 23 of user core. Jan 28 00:10:08.861000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:10:08.870827 kernel: audit: type=1327 audit(1769559008.861:881): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:10:08.875400 systemd[1]: Started session-23.scope - Session 23 of User core. 
Jan 28 00:10:08.876000 audit[5489]: USER_START pid=5489 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:08.878000 audit[5493]: CRED_ACQ pid=5493 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:08.884885 kernel: audit: type=1105 audit(1769559008.876:882): pid=5489 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:08.884933 kernel: audit: type=1103 audit(1769559008.878:883): pid=5493 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:09.205642 sshd[5493]: Connection closed by 4.153.228.146 port 52012 Jan 28 00:10:09.206532 sshd-session[5489]: pam_unix(sshd:session): session closed for user core Jan 28 00:10:09.208000 audit[5489]: USER_END pid=5489 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:09.212278 systemd[1]: sshd@21-10.0.1.105:22-4.153.228.146:52012.service: Deactivated successfully. Jan 28 00:10:09.208000 audit[5489]: CRED_DISP pid=5489 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:09.213952 systemd[1]: session-23.scope: Deactivated successfully. Jan 28 00:10:09.214700 systemd-logind[1641]: Session 23 logged out. Waiting for processes to exit. Jan 28 00:10:09.215792 systemd-logind[1641]: Removed session 23. Jan 28 00:10:09.216219 kernel: audit: type=1106 audit(1769559009.208:884): pid=5489 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:09.216262 kernel: audit: type=1104 audit(1769559009.208:885): pid=5489 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:09.211000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.1.105:22-4.153.228.146:52012 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:10:11.930512 kubelet[2911]: E0128 00:10:11.930444 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:10:13.929760 kubelet[2911]: E0128 00:10:13.929706 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:10:14.311000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.1.105:22-4.153.228.146:52020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:10:14.312486 systemd[1]: Started sshd@22-10.0.1.105:22-4.153.228.146:52020.service - OpenSSH per-connection server daemon (4.153.228.146:52020). Jan 28 00:10:14.313452 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:10:14.313488 kernel: audit: type=1130 audit(1769559014.311:887): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.1.105:22-4.153.228.146:52020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 28 00:10:14.839587 sshd[5508]: Accepted publickey for core from 4.153.228.146 port 52020 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:10:14.837000 audit[5508]: USER_ACCT pid=5508 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:14.845781 sshd-session[5508]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:10:14.842000 audit[5508]: CRED_ACQ pid=5508 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:14.849297 kernel: audit: type=1101 audit(1769559014.837:888): pid=5508 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:14.849362 kernel: audit: type=1103 audit(1769559014.842:889): pid=5508 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:14.851151 kernel: audit: type=1006 audit(1769559014.842:890): pid=5508 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Jan 28 00:10:14.842000 audit[5508]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe6875e40 a2=3 a3=0 items=0 ppid=1 pid=5508 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:10:14.854770 kernel: audit: type=1300 audit(1769559014.842:890): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe6875e40 a2=3 a3=0 items=0 ppid=1 pid=5508 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:10:14.853419 systemd-logind[1641]: New session 24 of user core. Jan 28 00:10:14.842000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:10:14.856436 kernel: audit: type=1327 audit(1769559014.842:890): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:10:14.859840 systemd[1]: Started session-24.scope - Session 24 of User core. 
Jan 28 00:10:14.862000 audit[5508]: USER_START pid=5508 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:14.866000 audit[5512]: CRED_ACQ pid=5512 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:14.870374 kernel: audit: type=1105 audit(1769559014.862:891): pid=5508 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:14.870475 kernel: audit: type=1103 audit(1769559014.866:892): pid=5512 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:14.929364 containerd[1660]: time="2026-01-28T00:10:14.929321730Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 28 00:10:15.188458 sshd[5512]: Connection closed by 4.153.228.146 port 52020 Jan 28 00:10:15.188726 sshd-session[5508]: pam_unix(sshd:session): session closed for user core Jan 28 00:10:15.189000 audit[5508]: USER_END pid=5508 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:15.194114 systemd[1]: sshd@22-10.0.1.105:22-4.153.228.146:52020.service: Deactivated successfully. Jan 28 00:10:15.190000 audit[5508]: CRED_DISP pid=5508 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:15.196971 systemd[1]: session-24.scope: Deactivated successfully. Jan 28 00:10:15.198332 systemd-logind[1641]: Session 24 logged out. Waiting for processes to exit. 
Jan 28 00:10:15.198821 kernel: audit: type=1106 audit(1769559015.189:893): pid=5508 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:15.198878 kernel: audit: type=1104 audit(1769559015.190:894): pid=5508 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:15.194000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.1.105:22-4.153.228.146:52020 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:10:15.200448 systemd-logind[1641]: Removed session 24. Jan 28 00:10:15.295255 containerd[1660]: time="2026-01-28T00:10:15.295124521Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:10:15.296666 containerd[1660]: time="2026-01-28T00:10:15.296567326Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 28 00:10:15.296769 containerd[1660]: time="2026-01-28T00:10:15.296611606Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 28 00:10:15.296941 kubelet[2911]: E0128 00:10:15.296905 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:10:15.298161 kubelet[2911]: E0128 00:10:15.296954 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 28 00:10:15.298161 kubelet[2911]: E0128 00:10:15.297087 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nkhp5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-579c57b986-v9zhd_calico-system(7d3bf2f0-73b0-413a-909b-a8ef20ea0438): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 28 00:10:15.299097 kubelet[2911]: E0128 00:10:15.299053 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:10:16.930036 kubelet[2911]: E0128 00:10:16.929972 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:10:18.932406 containerd[1660]: time="2026-01-28T00:10:18.932322289Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 28 00:10:19.283432 containerd[1660]: time="2026-01-28T00:10:19.283120154Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:10:19.284879 containerd[1660]: time="2026-01-28T00:10:19.284783759Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 28 00:10:19.284879 containerd[1660]: time="2026-01-28T00:10:19.284840679Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 28 00:10:19.285014 kubelet[2911]: E0128 00:10:19.284971 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:10:19.285551 kubelet[2911]: E0128 00:10:19.285021 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 28 00:10:19.285551 kubelet[2911]: E0128 00:10:19.285145 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs5cv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-gql7d_calico-system(33c364a8-976a-4a66-b401-5e5e3f5c6aca): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 28 00:10:19.286890 containerd[1660]: time="2026-01-28T00:10:19.286853166Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 28 00:10:19.663696 containerd[1660]: time="2026-01-28T00:10:19.663439309Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:10:19.665185 containerd[1660]: time="2026-01-28T00:10:19.665150635Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 28 00:10:19.665365 containerd[1660]: time="2026-01-28T00:10:19.665181555Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 28 00:10:19.665771 kubelet[2911]: E0128 00:10:19.665530 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:10:19.665771 kubelet[2911]: E0128 00:10:19.665580 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 28 00:10:19.665771 kubelet[2911]: E0128 00:10:19.665707 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs5cv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-gql7d_calico-system(33c364a8-976a-4a66-b401-5e5e3f5c6aca): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 28 00:10:19.666932 kubelet[2911]: E0128 00:10:19.666875 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:10:20.300797 systemd[1]: Started sshd@23-10.0.1.105:22-4.153.228.146:47760.service - OpenSSH per-connection server daemon (4.153.228.146:47760). 
Jan 28 00:10:20.299000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.1.105:22-4.153.228.146:47760 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:10:20.305619 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 28 00:10:20.305687 kernel: audit: type=1130 audit(1769559020.299:896): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.1.105:22-4.153.228.146:47760 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:10:20.814053 sshd[5526]: Accepted publickey for core from 4.153.228.146 port 47760 ssh2: RSA SHA256:VQ/LnIPdso+lFIBoB+RjpKCsxjBLjWTEGqjx+fJSwm4 Jan 28 00:10:20.812000 audit[5526]: USER_ACCT pid=5526 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:20.818982 sshd-session[5526]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 28 00:10:20.817000 audit[5526]: CRED_ACQ pid=5526 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:20.822733 kernel: audit: type=1101 audit(1769559020.812:897): pid=5526 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:20.822809 kernel: audit: type=1103 audit(1769559020.817:898): pid=5526 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:20.825197 kernel: audit: type=1006 audit(1769559020.817:899): pid=5526 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Jan 28 00:10:20.817000 audit[5526]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffcf5af60 a2=3 a3=0 items=0 ppid=1 pid=5526 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:10:20.829186 kernel: audit: type=1300 audit(1769559020.817:899): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffcf5af60 a2=3 a3=0 items=0 ppid=1 pid=5526 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 28 00:10:20.829439 kernel: audit: type=1327 audit(1769559020.817:899): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:10:20.817000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 28 00:10:20.832652 systemd-logind[1641]: New session 25 of user core. Jan 28 00:10:20.843402 systemd[1]: Started session-25.scope - Session 25 of User core. 
Jan 28 00:10:20.844000 audit[5526]: USER_START pid=5526 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:20.846000 audit[5530]: CRED_ACQ pid=5530 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:20.854052 kernel: audit: type=1105 audit(1769559020.844:900): pid=5526 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:20.854093 kernel: audit: type=1103 audit(1769559020.846:901): pid=5530 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:20.930258 containerd[1660]: time="2026-01-28T00:10:20.930049676Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 28 00:10:21.173050 sshd[5530]: Connection closed by 4.153.228.146 port 47760 Jan 28 00:10:21.173344 sshd-session[5526]: pam_unix(sshd:session): session closed for user core Jan 28 00:10:21.173000 audit[5526]: USER_END pid=5526 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:21.177263 systemd[1]: sshd@23-10.0.1.105:22-4.153.228.146:47760.service: Deactivated successfully. Jan 28 00:10:21.178980 systemd[1]: session-25.scope: Deactivated successfully. Jan 28 00:10:21.173000 audit[5526]: CRED_DISP pid=5526 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:21.182176 kernel: audit: type=1106 audit(1769559021.173:902): pid=5526 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:21.182247 kernel: audit: type=1104 audit(1769559021.173:903): pid=5526 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 28 00:10:21.182436 systemd-logind[1641]: Session 25 logged out. Waiting for processes to exit. 
Jan 28 00:10:21.176000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.1.105:22-4.153.228.146:47760 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 28 00:10:21.183372 systemd-logind[1641]: Removed session 25. Jan 28 00:10:21.265740 containerd[1660]: time="2026-01-28T00:10:21.265695376Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:10:21.267345 containerd[1660]: time="2026-01-28T00:10:21.267299821Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 28 00:10:21.267431 containerd[1660]: time="2026-01-28T00:10:21.267384861Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 28 00:10:21.267775 kubelet[2911]: E0128 00:10:21.267541 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 00:10:21.267775 kubelet[2911]: E0128 00:10:21.267588 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 28 00:10:21.267775 kubelet[2911]: E0128 00:10:21.267709 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-468nq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7d74b86764-tnxb5_calico-system(944e37ab-4f3a-457c-b85f-87dff3debf4c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 28 00:10:21.268938 kubelet[2911]: E0128 00:10:21.268899 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-7d74b86764-tnxb5" podUID="944e37ab-4f3a-457c-b85f-87dff3debf4c" Jan 28 00:10:22.929735 kubelet[2911]: E0128 00:10:22.929662 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-pmnrt" 
podUID="0bfcf205-da0c-44de-9408-ba1ddb5cd934" Jan 28 00:10:25.930965 kubelet[2911]: E0128 00:10:25.930910 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-579c57b986-v9zhd" podUID="7d3bf2f0-73b0-413a-909b-a8ef20ea0438" Jan 28 00:10:26.930310 containerd[1660]: time="2026-01-28T00:10:26.930232141Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:10:27.288274 containerd[1660]: time="2026-01-28T00:10:27.288014188Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:10:27.289414 containerd[1660]: time="2026-01-28T00:10:27.289378392Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 28 00:10:27.289507 containerd[1660]: time="2026-01-28T00:10:27.289448712Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 28 00:10:27.289635 kubelet[2911]: E0128 00:10:27.289595 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:10:27.289958 kubelet[2911]: E0128 00:10:27.289648 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 28 00:10:27.289958 kubelet[2911]: E0128 00:10:27.289771 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lvvxj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-564d6bbdf8-z4ccx_calico-apiserver(c30f0bb9-9acc-4a29-aba6-a8788797d9e8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 28 00:10:27.290999 kubelet[2911]: E0128 00:10:27.290971 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-564d6bbdf8-z4ccx" podUID="c30f0bb9-9acc-4a29-aba6-a8788797d9e8" Jan 28 00:10:27.929900 containerd[1660]: time="2026-01-28T00:10:27.929757497Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 28 00:10:28.268617 containerd[1660]: time="2026-01-28T00:10:28.268506246Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 28 00:10:28.269902 containerd[1660]: time="2026-01-28T00:10:28.269864810Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 28 00:10:28.270110 containerd[1660]: time="2026-01-28T00:10:28.269943250Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 28 00:10:28.270192 kubelet[2911]: E0128 00:10:28.270160 2911 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:10:28.270506 kubelet[2911]: E0128 00:10:28.270296 2911 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 28 00:10:28.270506 kubelet[2911]: E0128 00:10:28.270444 2911 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dh76x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-4g849_calico-system(115118f6-6ada-4444-93ef-3a99eaaacd44): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 28 00:10:28.272632 kubelet[2911]: E0128 00:10:28.272545 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-4g849" podUID="115118f6-6ada-4444-93ef-3a99eaaacd44" Jan 28 00:10:32.931107 kubelet[2911]: E0128 00:10:32.931061 2911 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-gql7d" podUID="33c364a8-976a-4a66-b401-5e5e3f5c6aca" Jan 28 00:10:33.930129 containerd[1660]: time="2026-01-28T00:10:33.930093922Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 28 00:10:34.268971 containerd[1660]: time="2026-01-28T00:10:34.268842831Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io